15896 1727203853.26403: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-bGV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15896 1727203853.27237: Added group all to inventory
15896 1727203853.27239: Added group ungrouped to inventory
15896 1727203853.27243: Group all now contains ungrouped
15896 1727203853.27246: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml
15896 1727203853.59692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15896 1727203853.59843: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15896 1727203853.59981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15896 1727203853.60042: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15896 1727203853.60524: Loaded config def from plugin (inventory/script)
15896 1727203853.60526: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15896 1727203853.60569: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15896 1727203853.60658: Loaded config def from plugin (inventory/yaml)
15896 1727203853.60663: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15896 1727203853.61084: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15896 1727203853.61936: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15896 1727203853.61940: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15896 1727203853.61943: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15896 1727203853.61949: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15896 1727203853.61953: Loading data from /tmp/network-zt6/inventory-rSl.yml
15896 1727203853.62227: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto
15896 1727203853.62297: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15896 1727203853.62336: Loading data from /tmp/network-zt6/inventory-rSl.yml
15896 1727203853.62610: group all already in inventory
15896 1727203853.62617: set inventory_file for managed-node1
15896 1727203853.62622: set inventory_dir for managed-node1
15896 1727203853.62623: Added host managed-node1 to inventory
15896 1727203853.62625: Added host managed-node1 to group all
15896 1727203853.62626: set ansible_host for managed-node1
15896 1727203853.62627: set ansible_ssh_extra_args for managed-node1
15896 1727203853.62630: set inventory_file for managed-node2
15896 1727203853.62633: set inventory_dir for managed-node2
15896 1727203853.62633: Added host managed-node2 to inventory
15896 1727203853.62635: Added host managed-node2 to group all
15896 1727203853.62636: set ansible_host for managed-node2
15896 1727203853.62637: set ansible_ssh_extra_args for managed-node2
15896 1727203853.62639: set inventory_file for managed-node3
15896 1727203853.62641: set inventory_dir for managed-node3
15896 1727203853.62642: Added host managed-node3 to inventory
15896 1727203853.62643: Added host managed-node3 to group all
15896 1727203853.62644: set ansible_host for managed-node3
15896 1727203853.62645: set ansible_ssh_extra_args for managed-node3
15896 1727203853.62647: Reconcile groups and hosts in inventory.
15896 1727203853.62651: Group ungrouped now contains managed-node1
15896 1727203853.62654: Group ungrouped now contains managed-node2
15896 1727203853.62656: Group ungrouped now contains managed-node3
15896 1727203853.62734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15896 1727203853.63316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15896 1727203853.63369: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15896 1727203853.63397: Loaded config def from plugin (vars/host_group_vars)
15896 1727203853.63400: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15896 1727203853.63406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15896 1727203853.63414: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15896 1727203853.63455: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15896 1727203853.64319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203853.64530: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15896 1727203853.64570: Loaded config def from plugin (connection/local)
15896 1727203853.64574: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15896 1727203853.66108: Loaded config def from plugin (connection/paramiko_ssh)
15896 1727203853.66112: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15896 1727203853.68043: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15896 1727203853.68200: Loaded config def from plugin (connection/psrp)
15896 1727203853.68203: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15896 1727203853.69659: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15896 1727203853.69700: Loaded config def from plugin (connection/ssh)
15896 1727203853.69704: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15896 1727203853.76419: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15896 1727203853.76579: Loaded config def from plugin (connection/winrm)
15896 1727203853.76583: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15896 1727203853.76614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15896 1727203853.76801: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15896 1727203853.76871: Loaded config def from plugin (shell/cmd)
15896 1727203853.76873: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15896 1727203853.77007: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15896 1727203853.77072: Loaded config def from plugin (shell/powershell)
15896 1727203853.77074: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15896 1727203853.77252: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15896 1727203853.77615: Loaded config def from plugin (shell/sh)
15896 1727203853.77617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15896 1727203853.77770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15896 1727203853.78004: Loaded config def from plugin (become/runas)
15896 1727203853.78006: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15896 1727203853.78424: Loaded config def from plugin (become/su)
15896 1727203853.78427: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15896 1727203853.78671: Loaded config def from plugin (become/sudo)
15896 1727203853.78673: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15896 1727203853.78865: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
15896 1727203853.79608: in VariableManager get_vars()
15896 1727203853.79629: done with get_vars()
15896 1727203853.79767: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15896 1727203853.86585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15896 1727203853.86801: in VariableManager get_vars()
15896 1727203853.86806: done with get_vars()
15896 1727203853.86809: variable 'playbook_dir' from source: magic vars
15896 1727203853.86810: variable 'ansible_playbook_python' from source: magic vars
15896 1727203853.86810: variable 'ansible_config_file' from source: magic vars
15896 1727203853.86811: variable 'groups' from source: magic vars
15896 1727203853.86812: variable 'omit' from source: magic vars
15896 1727203853.86812: variable 'ansible_version' from source: magic vars
15896 1727203853.86813: variable 'ansible_check_mode' from source: magic vars
15896 1727203853.86814: variable 'ansible_diff_mode' from source: magic vars
15896 1727203853.86815: variable 'ansible_forks' from source: magic vars
15896 1727203853.86815: variable 'ansible_inventory_sources' from source: magic vars
15896 1727203853.86816: variable 'ansible_skip_tags' from source: magic vars
15896 1727203853.86817: variable 'ansible_limit' from source: magic vars
15896 1727203853.86817: variable 'ansible_run_tags' from source: magic vars
15896 1727203853.86818: variable 'ansible_verbosity' from source: magic vars
15896 1727203853.86982: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml
15896 1727203853.89471: in VariableManager get_vars()
15896 1727203853.89493: done with get_vars()
15896 1727203853.89503: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
15896 1727203853.91525: in VariableManager get_vars()
15896 1727203853.91540: done with get_vars()
15896 1727203853.91549: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
15896 1727203853.91889: in VariableManager get_vars()
15896 1727203853.91908: done with get_vars()
15896 1727203853.92170: in VariableManager get_vars()
15896 1727203853.92388: done with get_vars()
15896 1727203853.92397: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
15896 1727203853.92472: in VariableManager get_vars()
15896 1727203853.92504: done with get_vars()
15896 1727203853.93153: in VariableManager get_vars()
15896 1727203853.93171: done with get_vars()
15896 1727203853.93242: variable 'omit' from source: magic vars
15896 1727203853.93265: variable 'omit' from source: magic vars
15896 1727203853.93302: in VariableManager get_vars()
15896 1727203853.93313: done with get_vars()
15896 1727203853.93478: in VariableManager get_vars()
15896 1727203853.93491: done with get_vars()
15896 1727203853.93526: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15896 1727203853.94169: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15896 1727203853.94705: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15896 1727203854.00524: in VariableManager get_vars()
15896 1727203854.00546: done with get_vars()
15896 1727203854.01578: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15896 1727203854.01827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15896 1727203854.05790: in VariableManager get_vars()
15896 1727203854.05811: done with get_vars()
15896 1727203854.05819: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
15896 1727203854.06119: in VariableManager get_vars()
15896 1727203854.06139: done with get_vars()
15896 1727203854.06266: in VariableManager get_vars()
15896 1727203854.06388: done with get_vars()
15896 1727203854.06996: in VariableManager get_vars()
15896 1727203854.07014: done with get_vars()
15896 1727203854.07019: variable 'omit' from source: magic vars
15896 1727203854.07031: variable 'omit' from source: magic vars
15896 1727203854.07335: variable 'controller_profile' from source: play vars
15896 1727203854.07487: in VariableManager get_vars()
15896 1727203854.07501: done with get_vars()
15896 1727203854.07528: in VariableManager get_vars()
15896 1727203854.07544: done with get_vars()
15896 1727203854.07577: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15896 1727203854.07913: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15896 1727203854.08231: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15896 1727203854.08952: in VariableManager get_vars()
15896 1727203854.09046: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15896 1727203854.14520: in VariableManager get_vars()
15896 1727203854.14543: done with get_vars()
15896 1727203854.14547: variable 'omit' from source: magic vars
15896 1727203854.14558: variable 'omit' from source: magic vars
15896 1727203854.14668: in VariableManager get_vars()
15896 1727203854.14688: done with get_vars()
15896 1727203854.14813: in VariableManager get_vars()
15896 1727203854.14832: done with get_vars()
15896 1727203854.14863: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15896 1727203854.15423: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15896 1727203854.15924: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15896 1727203854.16730: in VariableManager get_vars()
15896 1727203854.16754: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15896 1727203854.21023: in VariableManager get_vars()
15896 1727203854.21050: done with get_vars()
15896 1727203854.21056: variable 'omit' from source: magic vars
15896 1727203854.21070: variable 'omit' from source: magic vars
15896 1727203854.21419: in VariableManager get_vars()
15896 1727203854.21464: done with get_vars()
15896 1727203854.21593: in VariableManager get_vars()
15896 1727203854.21620: done with get_vars()
15896 1727203854.21650: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15896 1727203854.22162: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15896 1727203854.22340: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15896 1727203854.23782: in VariableManager get_vars()
15896 1727203854.23812: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15896 1727203854.28940: in VariableManager get_vars()
15896 1727203854.28972: done with get_vars()
15896 1727203854.29241: variable 'omit' from source: magic vars
15896 1727203854.29270: variable 'omit' from source: magic vars
15896 1727203854.29307: in VariableManager get_vars()
15896 1727203854.29330: done with get_vars()
15896 1727203854.29467: in VariableManager get_vars()
15896 1727203854.29492: done with get_vars()
15896 1727203854.29521: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15896 1727203854.29782: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15896 1727203854.30072: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15896 1727203854.31157: in VariableManager get_vars()
15896 1727203854.31293: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15896 1727203854.35724: in VariableManager get_vars()
15896 1727203854.35874: done with get_vars()
15896 1727203854.35887: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
15896 1727203854.37021: in VariableManager get_vars()
15896 1727203854.37051: done with get_vars()
15896 1727203854.37164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15896 1727203854.37180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15896 1727203854.38187: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15896 1727203854.38661: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15896 1727203854.38664: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
15896 1727203854.38698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15896 1727203854.38722: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15896 1727203854.39094: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15896 1727203854.39153: Loaded config def from plugin (callback/default)
15896 1727203854.39156: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15896 1727203854.40754: Loaded config def from plugin (callback/junit)
15896 1727203854.40757: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15896 1727203854.40816: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15896 1727203854.40905: Loaded config def from plugin (callback/minimal)
15896 1727203854.40907: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15896 1727203854.40947: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15896 1727203854.41016: Loaded config def from plugin (callback/tree)
15896 1727203854.41018: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15896 1727203854.41145: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15896 1727203854.41148: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
15896 1727203854.41186: in VariableManager get_vars()
15896 1727203854.41201: done with get_vars()
15896 1727203854.41207: in VariableManager get_vars()
15896 1727203854.41215: done with get_vars()
15896 1727203854.41225: variable 'omit' from source: magic vars
15896 1727203854.41266: in VariableManager get_vars()
15896 1727203854.41284: done with get_vars()
15896 1727203854.41312: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
15896 1727203854.42257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15896 1727203854.42340: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15896 1727203854.42595: getting the remaining hosts for this loop
15896 1727203854.42597: done getting the remaining hosts for this loop
15896 1727203854.42600: getting the next task for host managed-node1
15896 1727203854.42604: done getting next task for host managed-node1
15896 1727203854.42606: ^ task is: TASK: Gathering Facts
15896 1727203854.42608: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203854.42610: getting variables
15896 1727203854.42611: in VariableManager get_vars()
15896 1727203854.42620: Calling all_inventory to load vars for managed-node1
15896 1727203854.42623: Calling groups_inventory to load vars for managed-node1
15896 1727203854.42625: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203854.42637: Calling all_plugins_play to load vars for managed-node1
15896 1727203854.42648: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203854.42652: Calling groups_plugins_play to load vars for managed-node1
15896 1727203854.42690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203854.42865: done with get_vars()
15896 1727203854.42872: done getting variables
15896 1727203854.42992: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Tuesday 24 September 2024 14:50:54 -0400 (0:00:00.019) 0:00:00.019 *****
15896 1727203854.43014: entering _queue_task() for managed-node1/gather_facts
15896 1727203854.43015: Creating lock for gather_facts
15896 1727203854.43613: worker is 1 (out of 1 available)
15896 1727203854.43623: exiting _queue_task() for managed-node1/gather_facts
15896 1727203854.43636: done queuing things up, now waiting for results queue to drain
15896 1727203854.43638: waiting for pending results...
15896 1727203854.43817: running TaskExecutor() for managed-node1/TASK: Gathering Facts
15896 1727203854.43982: in run() - task 028d2410-947f-fb83-b6ad-0000000001bc
15896 1727203854.43985: variable 'ansible_search_path' from source: unknown
15896 1727203854.43990: calling self._execute()
15896 1727203854.44054: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203854.44067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203854.44082: variable 'omit' from source: magic vars
15896 1727203854.44196: variable 'omit' from source: magic vars
15896 1727203854.44236: variable 'omit' from source: magic vars
15896 1727203854.44281: variable 'omit' from source: magic vars
15896 1727203854.44333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15896 1727203854.44378: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15896 1727203854.44436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15896 1727203854.44439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203854.44446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203854.44485: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203854.44493: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203854.44499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203854.44653: Set connection var ansible_shell_type to sh
15896 1727203854.44656: Set connection var ansible_connection to ssh
15896 1727203854.44659: Set connection var ansible_shell_executable to /bin/sh
15896 1727203854.44663: Set connection var ansible_pipelining to False
15896 1727203854.44664: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203854.44667: Set connection var ansible_timeout to 10 15896 1727203854.44678: variable 'ansible_shell_executable' from source: unknown 15896 1727203854.44685: variable 'ansible_connection' from source: unknown 15896 1727203854.44693: variable 'ansible_module_compression' from source: unknown 15896 1727203854.44699: variable 'ansible_shell_type' from source: unknown 15896 1727203854.44705: variable 'ansible_shell_executable' from source: unknown 15896 1727203854.44711: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203854.44718: variable 'ansible_pipelining' from source: unknown 15896 1727203854.44762: variable 'ansible_timeout' from source: unknown 15896 1727203854.44765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203854.44913: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203854.44928: variable 'omit' from source: magic vars 15896 1727203854.44936: starting attempt loop 15896 1727203854.44942: running the handler 15896 1727203854.44963: variable 'ansible_facts' from source: unknown 15896 1727203854.45036: _low_level_execute_command(): starting 15896 1727203854.45039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203854.45964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203854.45986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203854.46093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203854.48014: stdout chunk (state=3): >>>/root <<< 15896 1727203854.48071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203854.48086: stdout chunk (state=3): >>><<< 15896 1727203854.48098: stderr chunk (state=3): >>><<< 15896 1727203854.48144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203854.48333: _low_level_execute_command(): starting 15896 1727203854.48341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491 `" && echo ansible-tmp-1727203854.4823227-15992-47039306806491="` echo /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491 `" ) && sleep 0' 15896 1727203854.49081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203854.49098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203854.49112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203854.49217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203854.49237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203854.49294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203854.49399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203854.51617: stdout chunk (state=3): >>>ansible-tmp-1727203854.4823227-15992-47039306806491=/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491 <<< 15896 1727203854.51695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203854.51844: stderr chunk (state=3): >>><<< 15896 1727203854.51848: stdout chunk (state=3): >>><<< 15896 1727203854.51897: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203854.4823227-15992-47039306806491=/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
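The `( umask 77 && mkdir -p "…/.ansible/tmp" && mkdir "…/ansible-tmp-<epoch>-<pid>-<random>" )` command executed above provisions an owner-only, per-task temporary directory on the target before any module file is copied over. As a minimal illustrative sketch only (the helper name and logic below are mine, not Ansible's actual implementation), the same pattern in Python:

```python
import os
import random
import time

def make_task_tmpdir(base: str = "~/.ansible/tmp") -> str:
    """Create an owner-only per-task temp dir named like the log's
    ansible-tmp-<epoch>-<pid>-<random> entries (illustrative sketch)."""
    base = os.path.expanduser(base)
    old_umask = os.umask(0o077)           # mirror the log's `umask 77`
    try:
        os.makedirs(base, exist_ok=True)  # `mkdir -p` on the base dir
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2**48))
        path = os.path.join(base, name)
        os.mkdir(path)                    # plain mkdir: fail if it already exists
    finally:
        os.umask(old_umask)
    return path
```

The umask of 077 is what makes both directories mode 0700, so module payloads staged there are not readable by other users on the target.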
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203854.51984: variable 'ansible_module_compression' from source: unknown
15896 1727203854.52185: ANSIBALLZ: Using generic lock for ansible.legacy.setup
15896 1727203854.52189: ANSIBALLZ: Acquiring lock
15896 1727203854.52191: ANSIBALLZ: Lock acquired: 140082272719056
15896 1727203854.52193: ANSIBALLZ: Creating module
15896 1727203855.04111: ANSIBALLZ: Writing module into payload
15896 1727203855.04370: ANSIBALLZ: Writing module
15896 1727203855.04393: ANSIBALLZ: Renaming module
15896 1727203855.04399: ANSIBALLZ: Done creating module
15896 1727203855.04513: variable 'ansible_facts' from source: unknown
15896 1727203855.04532: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203855.04583: _low_level_execute_command(): starting
15896 1727203855.04587: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
15896 1727203855.05934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15896 1727203855.06023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<<
15896 1727203855.06026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address
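The ANSIBALLZ entries above record Ansible packaging the `ansible.legacy.setup` module into a single self-contained payload; `ansible_module_compression` was set to `ZIP_DEFLATED` earlier in this log. A minimal sketch of the general zip-and-bootstrap idea, assuming nothing about Ansible's real payload format (both helpers below are hypothetical, not Ansible's implementation):

```python
import base64
import io
import zipfile

def build_payload(module_source: str) -> str:
    # Zip the module source (ZIP_DEFLATED, as in the log) and base64 it
    # so it can travel inside a small bootstrap script.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("__main__.py", module_source)
    return base64.b64encode(buf.getvalue()).decode("ascii")

def run_payload(b64: str) -> dict:
    # What the bootstrap does on the target: unpack and exec the module.
    # Returning the exec globals here is just for inspection in this sketch.
    data = base64.b64decode(b64.encode("ascii"))
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        code = zf.read("__main__.py").decode("utf-8")
    scope = {"__name__": "__main__"}
    exec(compile(code, "__main__.py", "exec"), scope)
    return scope
```

The "using cached module" entry later in the log shows why the lock exists: one worker builds the payload per module/compression pair, and subsequent tasks reuse it from the ansiballz_cache directory.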
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203855.06029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203855.06031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203855.06043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203855.06264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203855.06384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203855.08163: stdout chunk (state=3): >>>PLATFORM <<< 15896 1727203855.08235: stdout chunk (state=3): >>>Linux <<< 15896 1727203855.08259: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 15896 1727203855.08277: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15896 1727203855.08647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203855.08650: stdout chunk (state=3): >>><<< 15896 1727203855.08653: stderr chunk (state=3): >>><<< 15896 1727203855.08655: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203855.08662 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
15896 1727203855.08665: _low_level_execute_command(): starting
15896 1727203855.08666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
15896 1727203855.08911: Sending initial data
15896 1727203855.08915: Sent initial data (1181 bytes)
15896 1727203855.10007: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
15896 1727203855.10021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15896 1727203855.10255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
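The `echo PLATFORM; uname; echo FOUND; command -v …; echo ENDFOUND` probe whose output appears above is how the controller discovers candidate Python interpreters on the target: the `PLATFORM`, `FOUND`, and `ENDFOUND` markers delimit the uname result and the interpreter paths in stdout. A sketch of parsing that marker format (hypothetical helper, not Ansible's code), using the exact stdout captured in this log:

```python
def parse_interpreter_probe(output: str):
    # The probe prints PLATFORM, the uname output, FOUND, one path per
    # located interpreter, then ENDFOUND (as seen in the log above).
    lines = output.strip().splitlines()
    platform = lines[lines.index("PLATFORM") + 1]
    found = lines[lines.index("FOUND") + 1 : lines.index("ENDFOUND")]
    return platform, found

out = ("PLATFORM\nLinux\nFOUND\n/usr/bin/python3.12\n"
       "/usr/bin/python3\n/usr/bin/python3\nENDFOUND\n")
platform, interpreters = parse_interpreter_probe(out)
# platform -> 'Linux'; interpreters -> the three paths logged above
```

The duplicate `/usr/bin/python3` entry matches the log's "found interpreters" line: both the explicit `/usr/bin/python3` probe and the bare `python3` lookup resolve to the same binary.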
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203855.10302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203855.10379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203855.14225: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15896 1727203855.15084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203855.15087: stdout chunk (state=3): >>><<< 15896 1727203855.15090: stderr chunk (state=3): >>><<< 15896 1727203855.15092: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203855.15094: variable 'ansible_facts' from source: unknown
15896 1727203855.15096: variable 'ansible_facts' from source: unknown
15896 1727203855.15098: variable 'ansible_module_compression' from source: unknown
15896 1727203855.15100: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
15896 1727203855.15135: variable 'ansible_facts' from source: unknown
15896 1727203855.15350: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py
15896 1727203855.15578: Sending initial data
15896 1727203855.15582: Sent initial data (153 bytes)
15896 1727203855.16243: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL
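The JSON returned by the interpreter check above carries the raw text of the target's `/etc/os-release` (`osrelease_content`), which lets the controller identify the platform (CentOS Stream 10 here). `/etc/os-release` is a simple sequence of `KEY=value` / `KEY="value"` lines; a sketch of parsing it (illustrative helper, not Ansible's own):

```python
def parse_os_release(content: str) -> dict:
    # /etc/os-release: one KEY=value (value optionally double-quoted) per
    # line; blank lines and comments are skipped.
    info = {}
    for line in content.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        info[key] = value.strip('"')
    return info

# A subset of the osrelease_content captured in the log above.
sample = ('NAME="CentOS Stream"\n'
          'VERSION_ID="10"\n'
          'ID="centos"\n'
          'PLATFORM_ID="platform:el10"\n')
rel = parse_os_release(sample)
# rel["NAME"] -> 'CentOS Stream'; rel["VERSION_ID"] -> '10'
```

Fields like `ID`, `ID_LIKE`, and `VERSION_ID` are what map a host to distribution-specific behavior (package manager, service manager) during fact gathering.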
3.2.2 4 Jun 2024 <<< 15896 1727203855.16263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203855.16330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203855.16396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203855.16412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203855.16552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203855.16696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203855.19005: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203855.19128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203855.19229: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpcdlypaye /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py <<< 15896 1727203855.19232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py" <<< 15896 1727203855.19305: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpcdlypaye" to remote "/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py" <<< 15896 1727203855.21241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203855.21246: stdout chunk (state=3): >>><<< 15896 1727203855.21248: stderr chunk (state=3): >>><<< 15896 1727203855.21250: done transferring module to remote 15896 1727203855.21252: _low_level_execute_command(): starting 15896 1727203855.21254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/ /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py && sleep 0' 15896 1727203855.21835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203855.21893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203855.21971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203855.22001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203855.22015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203855.22103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203855.24441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203855.24445: stdout chunk (state=3): >>><<< 15896 1727203855.24447: stderr chunk (state=3): >>><<< 15896 1727203855.24462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203855.24469: _low_level_execute_command(): starting 15896 1727203855.24480: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/AnsiballZ_setup.py && sleep 0' 15896 1727203855.25093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203855.25110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203855.25135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203855.25153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203855.25171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203855.25248: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203855.25287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203855.25320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203855.25473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203855.27955: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15896 1727203855.28208: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # <<< 15896 1727203855.28291: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 15896 1727203855.28335: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.28355: stdout chunk (state=3): >>>import '_codecs' # <<< 15896 1727203855.28459: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15896 1727203855.28486: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df9b84d0> <<< 15896 1727203855.28573: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df987b30> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15896 1727203855.28593: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df9baa50> import '_signal' # import '_abc' # <<< 15896 1727203855.28732: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15896 1727203855.28858: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # <<< 15896 1727203855.28896: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15896 1727203855.28929: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 15896 1727203855.28956: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15896 1727203855.28971: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15896 1727203855.29105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 15896 1727203855.29117: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df769130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f22df76a060> <<< 15896 1727203855.29181: stdout chunk (state=3): >>>import 'site' # <<< 15896 1727203855.29388: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15896 1727203855.29863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15896 1727203855.29889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15896 1727203855.29947: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15896 1727203855.29951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.29991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15896 1727203855.30058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15896 1727203855.30093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15896 1727203855.30185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15896 1727203855.30224: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15896 1727203855.30227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15896 1727203855.30270: stdout chunk (state=3): >>>import '_operator' # <<< 15896 1727203855.30297: stdout chunk (state=3): >>>import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a7f50> <<< 15896 1727203855.30321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15896 1727203855.30355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15896 1727203855.30401: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15896 1727203855.30485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.30740: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7df890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7dff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7bfb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7bd280> <<< 15896 1727203855.30874: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a5040> <<< 15896 1727203855.30913: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15896 1727203855.30956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 
15896 1727203855.30982: stdout chunk (state=3): >>>import '_sre' # <<< 15896 1727203855.31010: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15896 1727203855.31051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15896 1727203855.31095: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15896 1727203855.31107: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15896 1727203855.31156: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df803770> <<< 15896 1727203855.31190: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df802390> <<< 15896 1727203855.31246: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 15896 1727203855.31250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 15896 1727203855.31288: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7be120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a6900> <<< 15896 1727203855.31400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df834830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15896 1727203855.31637: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df834ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df834b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df834f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df835640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df835310> import 'importlib.machinery' # <<< 15896 1727203855.31645: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15896 1727203855.31723: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f22df836510> <<< 15896 1727203855.31725: stdout chunk (state=3): >>>import 'importlib.util' # <<< 15896 1727203855.31728: stdout chunk (state=3): >>>import 'runpy' # <<< 15896 1727203855.31733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15896 1727203855.31751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15896 1727203855.31784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15896 1727203855.31800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84c710> <<< 15896 1727203855.31982: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84ddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84ec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.31987: stdout chunk (state=3): >>>import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84f290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84e1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15896 1727203855.31999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15896 1727203855.32040: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.32063: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84fd10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84f440> <<< 15896 1727203855.32108: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df836480> <<< 15896 1727203855.32179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15896 1727203855.32183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15896 1727203855.32278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df543c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object 
from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15896 1727203855.32297: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56c530> <<< 15896 1727203855.32317: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56c710> <<< 15896 1727203855.32350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15896 1727203855.32442: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.33011: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56d0a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f22df56c950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df541e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56ee40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56db80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df836c30> <<< 15896 1727203855.33017: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15896 1727203855.33109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.33129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15896 1727203855.33182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15896 1727203855.33216: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5971d0> <<< 15896 1727203855.33293: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15896 1727203855.33312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.33420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5bf560> <<< 15896 1727203855.33449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15896 1727203855.33515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15896 1727203855.33654: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61c2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15896 1727203855.33690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15896 1727203855.33807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15896 1727203855.33907: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61ea50> <<< 15896 1727203855.34025: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61c410> <<< 15896 1727203855.34084: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5e5340> <<< 15896 1727203855.34141: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def29430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5be360> <<< 15896 1727203855.34150: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56fda0> <<< 15896 1727203855.34459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15896 1727203855.34464: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f22df5be480> <<< 15896 1727203855.34858: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_l2avm4ww/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15896 1727203855.35204: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15896 1727203855.35289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15896 1727203855.35325: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8f110> <<< 15896 1727203855.35331: stdout chunk (state=3): >>>import '_typing' # <<< 15896 1727203855.35638: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def6e000> import 'pkgutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22def6d160> # zipimport: zlib available <<< 15896 1727203855.35843: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15896 1727203855.38014: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.40057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 15896 1727203855.40067: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8cfe0> <<< 15896 1727203855.40090: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15896 1727203855.40095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.40122: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15896 1727203855.40138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15896 1727203855.40163: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 15896 1727203855.40166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15896 1727203855.40212: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.40215: stdout chunk (state=3): >>># extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbea80> <<< 15896 1727203855.40264: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe810> <<< 15896 1727203855.40308: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe120> <<< 15896 1727203855.40328: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15896 1727203855.40394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe8a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8fda0> <<< 15896 1727203855.40407: stdout chunk (state=3): >>>import 'atexit' # <<< 15896 1727203855.40445: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.40448: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.40450: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbf7a0> <<< 15896 1727203855.40592: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbf9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from 
'/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 15896 1727203855.40648: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbff20> <<< 15896 1727203855.40665: stdout chunk (state=3): >>>import 'pwd' # <<< 15896 1727203855.40685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15896 1727203855.40728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15896 1727203855.40779: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee29c70> <<< 15896 1727203855.40816: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.40821: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee2b890> <<< 15896 1727203855.41100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2c260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2d160> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15896 1727203855.41171: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2fe30> <<< 15896 1727203855.41219: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee2ff80> <<< 15896 1727203855.41246: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2e120> <<< 15896 1727203855.41274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15896 1727203855.41319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15896 1727203855.41339: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15896 1727203855.41364: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15896 1727203855.41550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15896 1727203855.41595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15896 1727203855.41598: stdout chunk 
(state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee37ec0> <<< 15896 1727203855.41621: stdout chunk (state=3): >>>import '_tokenize' # <<< 15896 1727203855.41706: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee36990> <<< 15896 1727203855.41746: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee366f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15896 1727203855.41756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15896 1727203855.41867: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee36c60> <<< 15896 1727203855.41903: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2e600> <<< 15896 1727203855.41945: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.41975: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee7bf50> <<< 15896 1727203855.41987: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7c260> <<< 15896 1727203855.42053: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15896 1727203855.42070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 15896 1727203855.42096: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.42117: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee7dd00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7dac0> <<< 15896 1727203855.42130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15896 1727203855.42240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.42251: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee80260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7e3c0> <<< 15896 1727203855.42288: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15896 1727203855.42335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 
15896 1727203855.42350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15896 1727203855.42385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15896 1727203855.42580: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee83a40> <<< 15896 1727203855.42637: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee80410> <<< 15896 1727203855.42728: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.42731: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee84890> <<< 15896 1727203855.42765: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.42771: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee84a70> <<< 15896 1727203855.42832: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.42836: stdout chunk (state=3): >>>import 'systemd.id128' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee843e0> <<< 15896 1727203855.42849: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7c410> <<< 15896 1727203855.42874: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 15896 1727203855.42898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15896 1727203855.42933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15896 1727203855.42969: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.43000: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded103b0> <<< 15896 1727203855.43394: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee86b40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 
'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee87ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee867b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15896 1727203855.43474: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.43608: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.43612: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.43631: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 15896 1727203855.43644: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.43654: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15896 1727203855.43681: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.43861: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.44046: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.44986: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.45897: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15896 1727203855.45939: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 15896 1727203855.45957: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15896 1727203855.45960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.46044: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension 
module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded15610> <<< 15896 1727203855.46188: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded16540> <<< 15896 1727203855.46194: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee854f0> <<< 15896 1727203855.46276: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 15896 1727203855.46312: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 15896 1727203855.46478: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.46572: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.46820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15896 1727203855.46824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15896 1727203855.46843: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded162a0> <<< 15896 1727203855.46851: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.47602: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48374: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48482: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48588: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15896 
1727203855.48594: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48689: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15896 1727203855.48710: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48800: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.48923: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15896 1727203855.48964: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203855.48984: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15896 1727203855.49064: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 15896 1727203855.49080: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.49462: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.49838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15896 1727203855.49949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15896 1727203855.50063: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded17680> # zipimport: zlib available <<< 15896 1727203855.50181: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.50285: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 15896 1727203855.50383: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 15896 1727203855.50401: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15896 1727203855.50455: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15896 1727203855.50482: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.50544: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.50613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.50700: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.50803: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15896 1727203855.50881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.51006: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.51081: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded21ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded1f200> <<< 15896 1727203855.51185: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15896 1727203855.51233: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.51329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.51366: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.51435: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py <<< 15896 1727203855.51440: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.51686: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15896 1727203855.51762: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee0a9c0> <<< 15896 1727203855.51834: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22deefe690> <<< 15896 1727203855.51948: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded221b0> <<< 15896 1727203855.51977: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded21fa0> <<< 15896 1727203855.51982: stdout chunk (state=3): >>># destroy ansible.module_utils.distro <<< 15896 1727203855.51996: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 15896 1727203855.52014: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52067: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52099: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 15896 1727203855.52114: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' 
# <<< 15896 1727203855.52196: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15896 1727203855.52221: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52239: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52256: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 15896 1727203855.52279: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52373: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52472: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52532: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52598: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52659: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52714: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15896 1727203855.52789: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.52914: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.53185: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15896 1727203855.53402: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.53682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.53742: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.53822: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 15896 1727203855.53827: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.53872: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15896 1727203855.53899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15896 1727203855.53932: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15896 1727203855.53976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15896 1727203855.54015: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb60c0> <<< 15896 1727203855.54041: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15896 1727203855.54073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15896 1727203855.54101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15896 1727203855.54178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15896 1727203855.54201: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15896 1727203855.54232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15896 1727203855.54383: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f22de96bfe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de9705f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded9eab0> <<< 15896 1727203855.54411: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb6c60> <<< 15896 1727203855.54448: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb4770> <<< 15896 1727203855.54479: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb5040> <<< 15896 1727203855.54503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15896 1727203855.54614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15896 1727203855.54641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 15896 1727203855.54668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15896 1727203855.54695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 15896 1727203855.54720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15896 1727203855.54761: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15896 
1727203855.54771: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.54789: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de973260> <<< 15896 1727203855.54792: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de972b10> <<< 15896 1727203855.54831: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.55089: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de972cf0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de971f40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15896 1727203855.55097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15896 1727203855.55119: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de973440> <<< 15896 1727203855.55148: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15896 1727203855.55210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15896 1727203855.55250: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.55271: stdout chunk (state=3): >>># 
extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.55274: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de9d5f40> <<< 15896 1727203855.55330: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de973f20> <<< 15896 1727203855.55366: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb4530> <<< 15896 1727203855.55390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 15896 1727203855.55406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 15896 1727203855.55436: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55457: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 15896 1727203855.55496: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55594: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15896 1727203855.55711: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55782: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15896 1727203855.55870: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55899: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.55906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 15896 1727203855.55932: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 
1727203855.55977: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 15896 1727203855.56038: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56113: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 15896 1727203855.56384: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 15896 1727203855.56404: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56489: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56579: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.56652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15896 1727203855.56670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 15896 1727203855.56689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.57494: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58237: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15896 1727203855.58270: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58353: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58439: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58485: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58533: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 15896 1727203855.58547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 15896 1727203855.58567: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15896 1727203855.58610: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15896 1727203855.58674: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58761: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15896 1727203855.58872: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.58981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15896 1727203855.59002: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.59038: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15896 1727203855.59066: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.59189: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.59313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15896 1727203855.59330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15896 1727203855.59371: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de9d7530> <<< 15896 1727203855.59583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15896 1727203855.59632: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de9d6b70> <<< 15896 1727203855.59644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # 
<<< 15896 1727203855.59669: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.59777: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.59871: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15896 1727203855.59889: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60031: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15896 1727203855.60191: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60287: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60393: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15896 1727203855.60414: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60480: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.60549: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15896 1727203855.60631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15896 1727203855.60737: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.60843: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.60851: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dea0e1e0> <<< 15896 1727203855.61169: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de9fdfd0> <<< 15896 1727203855.61181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 15896 
1727203855.61207: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.61357: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.61686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15896 1727203855.61871: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.62099: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 15896 1727203855.62104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 15896 1727203855.62133: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.62199: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.62260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15896 1727203855.62282: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.62345: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.62420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 15896 1727203855.62437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15896 1727203855.62479: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203855.62689: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dea21c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dea21be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15896 1727203855.62909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63127: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 15896 1727203855.63143: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63301: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63481: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15896 1727203855.63629: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 15896 1727203855.63654: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63726: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.63941: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.64146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 15896 1727203855.64163: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15896 1727203855.64185: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.64479: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.64564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15896 1727203855.64585: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.64644: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.64696: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15896 1727203855.65625: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.66433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15896 1727203855.66445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15896 1727203855.66558: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.66670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15896 1727203855.66689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.66771: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.66886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15896 1727203855.66889: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.67049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.67500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15896 1727203855.67506: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.67647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.67977: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 15896 1727203855.68299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 15896 1727203855.68359: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203855.68410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' 
# <<< 15896 1727203855.68426: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68453: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15896 1727203855.68566: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68592: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15896 1727203855.68732: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203855.68764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15896 1727203855.68774: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68908: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.68933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 15896 1727203855.68949: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.69028: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.69118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15896 1727203855.69664: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.69993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15896 1727203855.70000: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70079: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15896 1727203855.70156: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70177: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 
1727203855.70216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15896 1727203855.70222: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70271: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15896 1727203855.70311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70353: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15896 1727203855.70395: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70495: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70589: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15896 1727203855.70626: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15896 1727203855.70636: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70688: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 15896 1727203855.70765: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70781: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70821: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70870: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.70941: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 15896 1727203855.71042: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15896 1727203855.71082: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71139: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15896 1727203855.71343: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15896 1727203855.71689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71694: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15896 1727203855.71770: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71832: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.71906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15896 1727203855.72094: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.72137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 15896 1727203855.72145: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 15896 1727203855.72156: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.72285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.72419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15896 1727203855.72580: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203855.72856: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # 
code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15896 1727203855.72898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15896 1727203855.72932: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de7ba6f0> <<< 15896 1727203855.73022: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de7b88f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de7b3da0> <<< 15896 1727203855.85630: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 15896 1727203855.85640: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de802720> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de801220> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203855.85643: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15896 1727203855.85646: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de803530> <<< 15896 1727203855.85665: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de802180> <<< 15896 1727203855.85939: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15896 1727203856.12683: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": 
{"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa"<<< 15896 1727203856.12830: stdout chunk (state=3): >>>, "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", 
"weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "55", "epoch": "1727203855", "epoch_int": "1727203855", "date": "2024-09-24", "time": "14:50:55", "iso8601_micro": "2024-09-24T18:50:55.746208Z", "iso8601": "2024-09-24T18:50:55Z", "iso8601_basic": "20240924T145055746208", "iso8601_basic_short": "20240924T145055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5439453125, "5m": 0.31982421875, "15m": 0.15185546875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3265, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial<<< 15896 1727203856.12854: stdout chunk (state=3): >>>": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": 
[], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 446, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788086272, "block_size": 4096, "block_total": 65519099, "block_available": 63913107, "block_used": 1605992, "inode_total": 131070960, "inode_available": 131027261, "inode_used": 43699, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": 
{"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15896 1727203856.13720: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 15896 1727203856.13754: stdout chunk (state=3): >>> # clear sys.path_hooks # clear builtins._ # clear sys.path<<< 15896 1727203856.13792: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins<<< 15896 1727203856.13796: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # 
cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 15896 1727203856.13844: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os<<< 15896 1727203856.13883: stdout chunk (state=3): >>> # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types<<< 15896 1727203856.13891: stdout chunk (state=3): >>> # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix<<< 15896 1727203856.14008: stdout chunk (state=3): >>> # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap 
# cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 15896 1727203856.14035: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit<<< 15896 1727203856.14090: stdout chunk (state=3): >>> # cleanup[2] removing grp # cleanup[2] 
removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback<<< 15896 1727203856.14183: stdout chunk (state=3): >>> # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing 
ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections<<< 15896 1727203856.14387: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 15896 1727203856.14432: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux<<< 15896 1727203856.14476: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd<<< 15896 1727203856.14510: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline<<< 15896 1727203856.14538: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb<<< 15896 1727203856.14564: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl<<< 15896 1727203856.14638: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd<<< 15896 1727203856.14642: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues<<< 15896 1727203856.14787: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15896 1727203856.15484: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15896 1727203856.15546: stdout chunk (state=3): >>># destroy importlib.machinery <<< 15896 1727203856.15549: stdout chunk (state=3): >>># 
destroy importlib._abc # destroy importlib.util <<< 15896 1727203856.15696: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma<<< 15896 1727203856.15709: stdout chunk (state=3): >>> # destroy zipfile._path<<< 15896 1727203856.15745: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib<<< 15896 1727203856.15861: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress <<< 15896 1727203856.15903: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json<<< 15896 1727203856.15934: stdout chunk (state=3): >>> # destroy grp # destroy encodings<<< 15896 1727203856.15945: stdout chunk (state=3): >>> # destroy _locale<<< 15896 1727203856.16002: stdout chunk (state=3): >>> # destroy locale <<< 15896 1727203856.16094: stdout chunk (state=3): >>># destroy select # destroy _signal <<< 15896 1727203856.16113: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid<<< 15896 1727203856.16151: stdout chunk (state=3): >>> # destroy selinux # destroy shutil # destroy distro # destroy distro.distro<<< 15896 1727203856.16170: stdout chunk (state=3): >>> # destroy argparse <<< 15896 1727203856.16240: stdout chunk (state=3): >>># destroy logging # destroy ansible.module_utils.facts.default_collectors<<< 15896 1727203856.16279: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 15896 1727203856.16399: stdout chunk (state=3): >>># destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle <<< 15896 1727203856.16405: stdout chunk (state=3): >>># destroy 
_compat_pickle # destroy _pickle <<< 15896 1727203856.16590: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue<<< 15896 1727203856.16615: stdout chunk (state=3): >>> # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 15896 1727203856.16644: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass <<< 15896 1727203856.16673: stdout chunk (state=3): >>># destroy pwd # destroy termios # destroy json <<< 15896 1727203856.16724: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob<<< 15896 1727203856.16750: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout<<< 15896 1727203856.16780: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 15896 1727203856.16806: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 15896 1727203856.17125: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] 
wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 15896 1727203856.17128: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 15896 1727203856.17175: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 15896 1727203856.17220: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 15896 1727203856.17249: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 15896 1727203856.17261: stdout chunk (state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 15896 1727203856.17288: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat<<< 15896 1727203856.17327: stdout chunk (state=3): >>> # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 15896 1727203856.17355: stdout chunk (state=3): >>> # cleanup[3] wiping _io # 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread<<< 15896 1727203856.17586: stdout chunk (state=3): >>> # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15896 1727203856.17724: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15896 1727203856.17757: stdout chunk (state=3): >>># destroy _socket <<< 15896 1727203856.17783: stdout chunk (state=3): >>># destroy _collections<<< 15896 1727203856.17804: stdout chunk (state=3): >>> <<< 15896 1727203856.17846: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 15896 1727203856.17901: stdout chunk (state=3): >>># destroy stat # destroy genericpath <<< 15896 1727203856.18009: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 15896 1727203856.18012: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15896 1727203856.18030: stdout chunk (state=3): >>># destroy _typing <<< 15896 1727203856.18042: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 15896 1727203856.18062: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator<<< 15896 1727203856.18111: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp <<< 15896 1727203856.18207: stdout chunk (state=3): >>># destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules<<< 15896 1727203856.18210: stdout chunk (state=3): >>> # destroy _frozen_importlib <<< 
15896 1727203856.18369: stdout chunk (state=3): >>># destroy codecs<<< 15896 1727203856.18427: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437<<< 15896 1727203856.18430: stdout chunk (state=3): >>> # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading<<< 15896 1727203856.18480: stdout chunk (state=3): >>> # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 15896 1727203856.18500: stdout chunk (state=3): >>> # destroy _random # destroy _weakref <<< 15896 1727203856.18540: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator <<< 15896 1727203856.18669: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15896 1727203856.19417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203856.19625: stderr chunk (state=3): >>><<< 15896 1727203856.19628: stdout chunk (state=3): >>><<< 15896 1727203856.19783: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df9b84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df987b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df9baa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df769130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df76a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a7f50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7df890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7dff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7bfb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7bd280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a5040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df803770> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22df802390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7be120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a6900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df834830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df834ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df834b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df834f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df7a2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df835640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df835310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df836510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84c710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84ddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22df84ec60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84f290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84e1b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df84fd10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df84f440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df836480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df543c80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56c7d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56c530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56c710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56d0a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22df56da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56c950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df541e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56ee40> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56db80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df836c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5971d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5bf560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61c2f0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61ea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df61c410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5e5340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def29430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df5be360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22df56fda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f22df5be480> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_l2avm4ww/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8f110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def6e000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def6d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8cfe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbea80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe120> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbe8a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22def8fda0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbf7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22defbf9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22defbff20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee29c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee2b890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2c260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2d160> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2fe30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee2ff80> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2e120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee37ec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee36990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee366f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee36c60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee2e600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee7bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7c260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee7dd00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7dac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee80260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7e3c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee83a40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee80410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee84890> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee84a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee843e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee7c410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded103b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee86b40> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dee87ef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee867b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded15610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded16540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee854f0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded162a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded17680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22ded21ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded1f200> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dee0a9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22deefe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded221b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded21fa0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb60c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de96bfe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de9705f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22ded9eab0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb6c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb4770> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb5040> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de973260> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22de972b10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de972cf0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de971f40> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de973440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de9d5f40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de973f20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dedb4530> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de9d7530> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f22de9d6b70> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dea0e1e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de9fdfd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22dea21c70> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22dea21be0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f22de7ba6f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de7b88f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de7b3da0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 
'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de802720> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de801220> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de803530> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f22de802180> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "55", "epoch": "1727203855", "epoch_int": "1727203855", "date": "2024-09-24", "time": "14:50:55", "iso8601_micro": "2024-09-24T18:50:55.746208Z", "iso8601": "2024-09-24T18:50:55Z", "iso8601_basic": "20240924T145055746208", "iso8601_basic_short": "20240924T145055", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.5439453125, "5m": 0.31982421875, "15m": 0.15185546875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2909, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 622, "free": 2909}, "nocache": {"free": 3265, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 446, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788086272, "block_size": 4096, "block_total": 65519099, "block_available": 63913107, "block_used": 1605992, "inode_total": 131070960, "inode_available": 131027261, "inode_used": 43699, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": 
"on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] 
removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] 
removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing 
_compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # 
cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # 
destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # 
destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy 
multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] 
wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib 
# cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] 
removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] 
removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy 
gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] 
removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # 
destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy 
configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # 
cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. 
See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 15896 1727203856.22179: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203856.22183: _low_level_execute_command(): starting 15896 1727203856.22185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203854.4823227-15992-47039306806491/ > /dev/null 2>&1 && sleep 0' 15896 1727203856.22695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.22929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203856.23096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.23164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203856.25755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203856.25809: stderr chunk (state=3): >>><<< 15896 1727203856.25995: stdout chunk (state=3): >>><<< 15896 1727203856.26013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203856.26024: handler run complete 15896 1727203856.26153: 
variable 'ansible_facts' from source: unknown 15896 1727203856.26343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.29354: variable 'ansible_facts' from source: unknown 15896 1727203856.29707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.29830: attempt loop complete, returning result 15896 1727203856.29834: _execute() done 15896 1727203856.29836: dumping result to json 15896 1727203856.29866: done dumping result, returning 15896 1727203856.29874: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-fb83-b6ad-0000000001bc] 15896 1727203856.29878: sending task result for task 028d2410-947f-fb83-b6ad-0000000001bc ok: [managed-node1] 15896 1727203856.31754: no more pending results, returning what we have 15896 1727203856.31758: results queue empty 15896 1727203856.31759: checking for any_errors_fatal 15896 1727203856.31763: done checking for any_errors_fatal 15896 1727203856.31764: checking for max_fail_percentage 15896 1727203856.31765: done checking for max_fail_percentage 15896 1727203856.31766: checking to see if all hosts have failed and the running result is not ok 15896 1727203856.31767: done checking to see if all hosts have failed 15896 1727203856.31768: getting the remaining hosts for this loop 15896 1727203856.31769: done getting the remaining hosts for this loop 15896 1727203856.31773: getting the next task for host managed-node1 15896 1727203856.31782: done getting next task for host managed-node1 15896 1727203856.31784: ^ task is: TASK: meta (flush_handlers) 15896 1727203856.31787: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203856.31792: getting variables 15896 1727203856.31793: in VariableManager get_vars() 15896 1727203856.31817: Calling all_inventory to load vars for managed-node1 15896 1727203856.31941: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001bc 15896 1727203856.31944: WORKER PROCESS EXITING 15896 1727203856.31947: Calling groups_inventory to load vars for managed-node1 15896 1727203856.31951: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203856.31964: Calling all_plugins_play to load vars for managed-node1 15896 1727203856.31967: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203856.31970: Calling groups_plugins_play to load vars for managed-node1 15896 1727203856.32318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.32680: done with get_vars() 15896 1727203856.32691: done getting variables 15896 1727203856.32884: in VariableManager get_vars() 15896 1727203856.32894: Calling all_inventory to load vars for managed-node1 15896 1727203856.32896: Calling groups_inventory to load vars for managed-node1 15896 1727203856.32899: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203856.32904: Calling all_plugins_play to load vars for managed-node1 15896 1727203856.32906: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203856.32944: Calling groups_plugins_play to load vars for managed-node1 15896 1727203856.33349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.33921: done with get_vars() 15896 1727203856.33935: done queuing things up, now waiting for results queue to drain 15896 1727203856.33937: results queue empty 15896 1727203856.33938: checking for any_errors_fatal 15896 1727203856.33941: done checking for any_errors_fatal 15896 1727203856.33941: checking for max_fail_percentage 15896 
1727203856.33942: done checking for max_fail_percentage 15896 1727203856.33943: checking to see if all hosts have failed and the running result is not ok 15896 1727203856.33944: done checking to see if all hosts have failed 15896 1727203856.33949: getting the remaining hosts for this loop 15896 1727203856.33950: done getting the remaining hosts for this loop 15896 1727203856.33953: getting the next task for host managed-node1 15896 1727203856.33958: done getting next task for host managed-node1 15896 1727203856.33964: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15896 1727203856.33966: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203856.33968: getting variables 15896 1727203856.33969: in VariableManager get_vars() 15896 1727203856.33979: Calling all_inventory to load vars for managed-node1 15896 1727203856.33982: Calling groups_inventory to load vars for managed-node1 15896 1727203856.33984: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203856.33990: Calling all_plugins_play to load vars for managed-node1 15896 1727203856.33992: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203856.33995: Calling groups_plugins_play to load vars for managed-node1 15896 1727203856.34157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.34389: done with get_vars() 15896 1727203856.34397: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Tuesday 24 September 2024 14:50:56 -0400 (0:00:01.914) 0:00:01.934 ***** 15896 
1727203856.34483: entering _queue_task() for managed-node1/include_tasks 15896 1727203856.34485: Creating lock for include_tasks 15896 1727203856.34910: worker is 1 (out of 1 available) 15896 1727203856.34922: exiting _queue_task() for managed-node1/include_tasks 15896 1727203856.34933: done queuing things up, now waiting for results queue to drain 15896 1727203856.34935: waiting for pending results... 15896 1727203856.35455: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 15896 1727203856.35990: in run() - task 028d2410-947f-fb83-b6ad-000000000006 15896 1727203856.36015: variable 'ansible_search_path' from source: unknown 15896 1727203856.36109: calling self._execute() 15896 1727203856.36392: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203856.36395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203856.36397: variable 'omit' from source: magic vars 15896 1727203856.36681: _execute() done 15896 1727203856.36685: dumping result to json 15896 1727203856.36687: done dumping result, returning 15896 1727203856.36689: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-fb83-b6ad-000000000006] 15896 1727203856.36692: sending task result for task 028d2410-947f-fb83-b6ad-000000000006 15896 1727203856.36772: done sending task result for task 028d2410-947f-fb83-b6ad-000000000006 15896 1727203856.36777: WORKER PROCESS EXITING 15896 1727203856.36824: no more pending results, returning what we have 15896 1727203856.36830: in VariableManager get_vars() 15896 1727203856.36865: Calling all_inventory to load vars for managed-node1 15896 1727203856.36869: Calling groups_inventory to load vars for managed-node1 15896 1727203856.36873: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203856.36895: Calling all_plugins_play to load vars for managed-node1 15896 1727203856.36899: Calling 
groups_plugins_inventory to load vars for managed-node1 15896 1727203856.36903: Calling groups_plugins_play to load vars for managed-node1 15896 1727203856.37711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.38136: done with get_vars() 15896 1727203856.38143: variable 'ansible_search_path' from source: unknown 15896 1727203856.38157: we have included files to process 15896 1727203856.38159: generating all_blocks data 15896 1727203856.38163: done generating all_blocks data 15896 1727203856.38164: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15896 1727203856.38165: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15896 1727203856.38168: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15896 1727203856.39148: in VariableManager get_vars() 15896 1727203856.39169: done with get_vars() 15896 1727203856.39296: done processing included file 15896 1727203856.39299: iterating over new_blocks loaded from include file 15896 1727203856.39300: in VariableManager get_vars() 15896 1727203856.39314: done with get_vars() 15896 1727203856.39316: filtering new block on tags 15896 1727203856.39330: done filtering new block on tags 15896 1727203856.39333: in VariableManager get_vars() 15896 1727203856.39343: done with get_vars() 15896 1727203856.39344: filtering new block on tags 15896 1727203856.39400: done filtering new block on tags 15896 1727203856.39403: in VariableManager get_vars() 15896 1727203856.39414: done with get_vars() 15896 1727203856.39416: filtering new block on tags 15896 1727203856.39513: done filtering new block on tags 15896 1727203856.39515: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 15896 1727203856.39521: extending task lists for all hosts with included blocks 15896 1727203856.39655: done extending task lists 15896 1727203856.39656: done processing included files 15896 1727203856.39657: results queue empty 15896 1727203856.39658: checking for any_errors_fatal 15896 1727203856.39662: done checking for any_errors_fatal 15896 1727203856.39663: checking for max_fail_percentage 15896 1727203856.39664: done checking for max_fail_percentage 15896 1727203856.39664: checking to see if all hosts have failed and the running result is not ok 15896 1727203856.39665: done checking to see if all hosts have failed 15896 1727203856.39666: getting the remaining hosts for this loop 15896 1727203856.39667: done getting the remaining hosts for this loop 15896 1727203856.39669: getting the next task for host managed-node1 15896 1727203856.39673: done getting next task for host managed-node1 15896 1727203856.39677: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15896 1727203856.39679: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203856.39681: getting variables 15896 1727203856.39682: in VariableManager get_vars() 15896 1727203856.39690: Calling all_inventory to load vars for managed-node1 15896 1727203856.39692: Calling groups_inventory to load vars for managed-node1 15896 1727203856.39694: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203856.39700: Calling all_plugins_play to load vars for managed-node1 15896 1727203856.39702: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203856.39705: Calling groups_plugins_play to load vars for managed-node1 15896 1727203856.40446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203856.40739: done with get_vars() 15896 1727203856.40749: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:50:56 -0400 (0:00:00.063) 0:00:01.997 ***** 15896 1727203856.40838: entering _queue_task() for managed-node1/setup 15896 1727203856.41217: worker is 1 (out of 1 available) 15896 1727203856.41230: exiting _queue_task() for managed-node1/setup 15896 1727203856.41584: done queuing things up, now waiting for results queue to drain 15896 1727203856.41587: waiting for pending results... 
15896 1727203856.41695: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 15896 1727203856.41928: in run() - task 028d2410-947f-fb83-b6ad-0000000001cd 15896 1727203856.41945: variable 'ansible_search_path' from source: unknown 15896 1727203856.42057: variable 'ansible_search_path' from source: unknown 15896 1727203856.42393: calling self._execute() 15896 1727203856.42397: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203856.42400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203856.42870: variable 'omit' from source: magic vars 15896 1727203856.43634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203856.47574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203856.47651: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203856.47703: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203856.47769: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203856.47808: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203856.47906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203856.47942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203856.47980: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203856.48031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203856.48052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203856.48282: variable 'ansible_facts' from source: unknown 15896 1727203856.48339: variable 'network_test_required_facts' from source: task vars 15896 1727203856.48388: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15896 1727203856.48400: variable 'omit' from source: magic vars 15896 1727203856.48498: variable 'omit' from source: magic vars 15896 1727203856.48502: variable 'omit' from source: magic vars 15896 1727203856.48516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203856.48554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203856.48578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203856.48605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203856.48622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203856.48668: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203856.48679: variable 'ansible_host' from source: host vars for 
'managed-node1' 15896 1727203856.48716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203856.48798: Set connection var ansible_shell_type to sh 15896 1727203856.48811: Set connection var ansible_connection to ssh 15896 1727203856.49079: Set connection var ansible_shell_executable to /bin/sh 15896 1727203856.49083: Set connection var ansible_pipelining to False 15896 1727203856.49085: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203856.49087: Set connection var ansible_timeout to 10 15896 1727203856.49089: variable 'ansible_shell_executable' from source: unknown 15896 1727203856.49091: variable 'ansible_connection' from source: unknown 15896 1727203856.49094: variable 'ansible_module_compression' from source: unknown 15896 1727203856.49095: variable 'ansible_shell_type' from source: unknown 15896 1727203856.49097: variable 'ansible_shell_executable' from source: unknown 15896 1727203856.49099: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203856.49101: variable 'ansible_pipelining' from source: unknown 15896 1727203856.49103: variable 'ansible_timeout' from source: unknown 15896 1727203856.49104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203856.49801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203856.49805: variable 'omit' from source: magic vars 15896 1727203856.49807: starting attempt loop 15896 1727203856.49809: running the handler 15896 1727203856.49811: _low_level_execute_command(): starting 15896 1727203856.49813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203856.51693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.51794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203856.52007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.52284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203856.54061: stdout chunk (state=3): >>>/root <<< 15896 1727203856.54193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203856.54198: stdout chunk (state=3): >>><<< 15896 1727203856.54207: stderr chunk (state=3): >>><<< 15896 1727203856.54229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203856.54243: _low_level_execute_command(): starting 15896 1727203856.54249: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995 `" && echo ansible-tmp-1727203856.5422974-16144-11339932657995="` echo /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995 `" ) && sleep 0' 15896 1727203856.55320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203856.55493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203856.55504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.55519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203856.55531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203856.55538: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203856.55548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203856.55565: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203856.55582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203856.55588: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203856.55591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203856.55600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.55612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203856.55621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203856.55627: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203856.55637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.55711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203856.55985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203856.55989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.56104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203856.58119: stdout chunk (state=3): >>>ansible-tmp-1727203856.5422974-16144-11339932657995=/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995 <<< 15896 1727203856.58269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203856.58272: stdout chunk (state=3): >>><<< 15896 1727203856.58281: stderr chunk (state=3): >>><<< 15896 1727203856.58301: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203856.5422974-16144-11339932657995=/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203856.58352: variable 'ansible_module_compression' from source: unknown 15896 1727203856.58405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15896 1727203856.58460: variable 'ansible_facts' from source: unknown 15896 1727203856.58870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py 15896 1727203856.59337: Sending initial data 15896 1727203856.59340: Sent initial data (153 bytes) 15896 1727203856.60824: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.60827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203856.60933: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203856.61116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.61195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203856.63093: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203856.63405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203856.63427: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpnkbjarlw /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py <<< 15896 1727203856.63430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py" <<< 15896 1727203856.63536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpnkbjarlw" to remote "/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py" <<< 15896 1727203856.66093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203856.66128: stderr chunk (state=3): >>><<< 15896 1727203856.66165: stdout chunk (state=3): >>><<< 15896 1727203856.66193: done transferring module to remote 15896 1727203856.66212: _low_level_execute_command(): starting 15896 1727203856.66222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/ /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py && sleep 0' 15896 1727203856.66885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203856.66900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203856.66936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.66949: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.67016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203856.67054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.67107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15896 1727203856.69814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203856.69841: stderr chunk (state=3): >>><<< 15896 1727203856.69862: stdout chunk (state=3): >>><<< 15896 1727203856.69938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15896 1727203856.69942: _low_level_execute_command(): starting 15896 1727203856.69945: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/AnsiballZ_setup.py && sleep 0' 15896 1727203856.71167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203856.71171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203856.71173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.71184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203856.71288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203856.71292: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203856.71294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.71297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203856.71299: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203856.71301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203856.71303: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15896 1727203856.71304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203856.71306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203856.71308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203856.71310: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203856.71312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203856.71395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203856.71516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203856.74659: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15896 1727203856.74679: stdout chunk (state=3): >>>import _imp # builtin <<< 15896 1727203856.74701: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 15896 1727203856.74710: stdout chunk (state=3): >>>import '_weakref' # <<< 15896 1727203856.74858: stdout chunk (state=3): >>>import '_io' # <<< 15896 1727203856.74955: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 15896 1727203856.74959: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 15896 1727203856.74961: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 15896 1727203856.75335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 
15896 1727203856.75344: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3181684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318137b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31816aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 15896 1727203856.75351: stdout chunk (state=3): >>>import '_stat' # <<< 15896 1727203856.75353: stdout chunk (state=3): >>>import 'stat' # <<< 15896 1727203856.75356: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15896 1727203856.75358: stdout chunk (state=3): >>>import 'genericpath' # <<< 15896 1727203856.75411: stdout chunk (state=3): >>>import 'posixpath' # <<< 15896 1727203856.75415: stdout chunk (state=3): >>>import 'os' # <<< 15896 1727203856.75419: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15896 1727203856.75706: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd317f5d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f5e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15896 1727203856.76042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15896 1727203856.76068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15896 1727203856.76087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203856.76133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15896 1727203856.76151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15896 1727203856.76164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15896 1727203856.76223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15896 1727203856.76260: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f9bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15896 1727203856.76562: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb0110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fd3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fd3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb3bf0> <<< 15896 1727203856.76574: stdout chunk (state=3): >>>import '_functools' # <<< 15896 1727203856.76685: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb12e0> <<< 15896 1727203856.76704: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f99130> <<< 15896 1727203856.76720: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15896 1727203856.76732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15896 1727203856.76781: 
stdout chunk (state=3): >>>import '_sre' # <<< 15896 1727203856.76788: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15896 1727203856.76811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15896 1727203856.76814: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15896 1727203856.76856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15896 1727203856.76910: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff78f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb2390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff4d40> <<< 15896 1727203856.77187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318024950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f983b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318024e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318024cb0> <<< 15896 1727203856.77192: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.77200: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3180250a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f96ed0> <<< 15896 1727203856.77216: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15896 1727203856.77292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318025790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318025460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318026660> import 'importlib.util' # import 'runpy' # <<< 15896 1727203856.77323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py <<< 15896 1727203856.77355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15896 1727203856.77460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318040890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318041fd0> <<< 15896 1727203856.77538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15896 1727203856.77995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318042e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3180434a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3180423c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318043e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318043590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3180266c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d37d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d60860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d605c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d60890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15896 1727203856.78090: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.78508: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d611c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d61bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d60a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d35eb0> <<< 15896 1727203856.78569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15896 1727203856.78687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15896 1727203856.78690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15896 1727203856.78692: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d62f90> <<< 15896 1727203856.78694: stdout chunk (state=3): >>>import 'weakref' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd317d61d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318026db0> <<< 15896 1727203856.78844: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15896 1727203856.78850: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203856.78856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15896 1727203856.78859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15896 1727203856.78941: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d8b2f0> <<< 15896 1727203856.79159: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203856.79168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15896 1727203856.79173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317daf6e0> <<< 15896 1727203856.79179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15896 1727203856.79186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15896 1727203856.79365: stdout chunk (state=3): >>>import 'ntpath' # <<< 15896 1727203856.79392: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e104d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15896 1727203856.79482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15896 1727203856.79587: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e12c30> <<< 15896 1727203856.79796: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e105f0> <<< 15896 1727203856.79828: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317dd94c0> <<< 15896 1727203856.79948: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177295e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317dae4e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d63ec0> <<< 15896 1727203856.80282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15896 1727203856.80345: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 
0x7fd317dae840> <<< 15896 1727203856.81084: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_rznnixv2/ansible_setup_payload.zip' <<< 15896 1727203856.81101: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.81199: stdout chunk (state=3): >>> <<< 15896 1727203856.81306: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.81310: stdout chunk (state=3): >>> <<< 15896 1727203856.81346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 15896 1727203856.81397: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15896 1727203856.81490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15896 1727203856.81666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 15896 1727203856.81689: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778f2c0> <<< 15896 1727203856.81750: stdout chunk (state=3): >>>import '_typing' # <<< 15896 1727203856.82045: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177721b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317771340><<< 15896 1727203856.82081: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15896 1727203856.82197: stdout chunk (state=3): >>> <<< 15896 1727203856.82282: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils' # # zipimport: zlib available <<< 15896 1727203856.84567: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.84795: stdout chunk (state=3): >>> <<< 15896 1727203856.86608: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 15896 1727203856.86781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778d160> <<< 15896 1727203856.86785: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 15896 1727203856.86788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203856.86790: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15896 1727203856.86884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15896 1727203856.86921: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bebd0> <<< 15896 1727203856.86985: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177be960><<< 15896 1727203856.87040: stdout chunk (state=3): >>> import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd3177be270> <<< 15896 1727203856.87115: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 15896 1727203856.87180: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177bed80><<< 15896 1727203856.87183: stdout chunk (state=3): >>> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778fce0><<< 15896 1727203856.87262: stdout chunk (state=3): >>> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.87281: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.87355: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bf920> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.87456: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.87460: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bfb60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15896 1727203856.87480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 15896 1727203856.87516: stdout chunk (state=3): >>> import '_locale' # <<< 15896 1727203856.87674: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177e8050> import 'pwd' # 
# /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15896 1727203856.87788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 15896 1727203856.87796: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31762ddf0> <<< 15896 1727203856.87803: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.88015: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd31762fa10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317634410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15896 1727203856.88073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 15896 1727203856.88112: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317635310><<< 15896 1727203856.88125: stdout chunk (state=3): >>> <<< 15896 1727203856.88143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 15896 1727203856.88211: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 15896 1727203856.88240: stdout chunk (state=3): >>> <<< 15896 1727203856.88264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py <<< 15896 1727203856.88381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15896 1727203856.88384: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317637f50> <<< 15896 1727203856.88465: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317638140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3176362d0> <<< 15896 1727203856.88556: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15896 1727203856.88591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 15896 1727203856.88761: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15896 1727203856.88852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15896 1727203856.88914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763bf20><<< 15896 1727203856.88933: stdout chunk (state=3): >>> import '_tokenize' # 
<<< 15896 1727203856.89040: stdout chunk (state=3): >>> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763a9f0><<< 15896 1727203856.89121: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763a750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 15896 1727203856.89127: stdout chunk (state=3): >>> <<< 15896 1727203856.89258: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763acc0> <<< 15896 1727203856.89382: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3176367e0><<< 15896 1727203856.89385: stdout chunk (state=3): >>> <<< 15896 1727203856.89429: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.89446: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317680170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203856.89470: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317680290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 15896 1727203856.89488: stdout chunk (state=3): >>> <<< 15896 1727203856.89513: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 15896 1727203856.89552: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 15896 1727203856.89664: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317681e50><<< 15896 1727203856.89746: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317681c10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15896 1727203856.89785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15896 1727203856.89829: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.89838: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3176843b0> <<< 15896 1727203856.89866: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317682540><<< 15896 1727203856.89977: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15896 1727203856.89992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 15896 
1727203856.90013: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15896 1727203856.90077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 15896 1727203856.90085: stdout chunk (state=3): >>> import '_string' # <<< 15896 1727203856.90139: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317687b90><<< 15896 1727203856.90149: stdout chunk (state=3): >>> <<< 15896 1727203856.90418: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317684560> <<< 15896 1727203856.90509: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.90512: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.90625: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688c50> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.90631: stdout chunk (state=3): >>> <<< 15896 1727203856.90682: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688bf0> <<< 15896 1727203856.90797: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317680590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15896 1727203856.90814: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.90853: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.90962: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317514650> <<< 15896 1727203856.91183: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.91186: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.91195: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3175158e0> <<< 15896 1727203856.91218: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31768ade0> <<< 15896 1727203856.91263: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203856.91383: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd31768b9b0><<< 15896 1727203856.91411: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31768a9c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15896 1727203856.91681: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 15896 1727203856.91687: stdout chunk (state=3): >>> <<< 15896 1727203856.91693: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.91703: stdout chunk (state=3): >>> <<< 15896 1727203856.91744: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 15896 1727203856.91881: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 15896 1727203856.91885: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15896 1727203856.92037: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.92293: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.92386: stdout chunk (state=3): >>> <<< 15896 1727203856.93228: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.93337: stdout chunk (state=3): >>> <<< 15896 1727203856.94314: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15896 1727203856.94332: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15896 1727203856.94370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 15896 1727203856.94388: stdout chunk (state=3): >>> <<< 15896 1727203856.94443: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203856.94545: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317519af0> <<< 15896 1727203856.94633: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15896 1727203856.94653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751a870> <<< 15896 1727203856.94683: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317515c10> <<< 15896 1727203856.94854: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available<<< 15896 1727203856.94983: stdout chunk (state=3): >>> <<< 15896 1727203856.95115: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.95377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15896 1727203856.95395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'<<< 15896 1727203856.95419: stdout chunk (state=3): 
>>> <<< 15896 1727203856.95581: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751af90> <<< 15896 1727203856.95585: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.96294: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.97083: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.97102: stdout chunk (state=3): >>> <<< 15896 1727203856.97216: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203856.97336: stdout chunk (state=3): >>> import 'ansible.module_utils.common.collections' # <<< 15896 1727203856.97372: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.97481: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.97539: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15896 1727203856.97656: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.98081: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15896 1727203856.98087: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15896 1727203856.98502: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203856.98865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 15896 1727203856.98868: stdout chunk (state=3): >>> <<< 15896 1727203856.98995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15896 1727203856.99082: stdout chunk (state=3): >>> <<< 15896 1727203856.99180: stdout chunk (state=3): >>>import 'ast' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd31751bb00> <<< 15896 1727203856.99267: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 15896 1727203856.99283: stdout chunk (state=3): >>> <<< 15896 1727203856.99388: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15896 1727203856.99413: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # <<< 15896 1727203856.99478: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15896 1727203856.99709: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available<<< 15896 1727203856.99723: stdout chunk (state=3): >>> <<< 15896 1727203856.99841: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.00005: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15896 1727203857.00225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.00253: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3175265a0><<< 15896 1727203857.00580: stdout chunk (state=3): >>> <<< 15896 1727203857.00583: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317522f60><<< 15896 
1727203857.00586: stdout chunk (state=3): >>> <<< 15896 1727203857.00588: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 15896 1727203857.00590: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # <<< 15896 1727203857.00592: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.00594: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.00596: stdout chunk (state=3): >>> <<< 15896 1727203857.00678: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.00707: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.00836: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 15896 1727203857.00866: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15896 1727203857.00891: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 15896 1727203857.00957: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15896 1727203857.01067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15896 1727203857.01095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 15896 1727203857.01198: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31760ed50><<< 15896 1727203857.01288: stdout chunk (state=3): 
>>> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177eea20> <<< 15896 1727203857.01410: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317526360> <<< 15896 1727203857.01444: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751cb90> # destroy ansible.module_utils.distro <<< 15896 1727203857.01494: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 15896 1727203857.01619: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.01622: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15896 1727203857.01694: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 15896 1727203857.01756: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.01832: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 15896 1727203857.01844: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.02060: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.02063: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.02101: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.02161: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.02190: stdout chunk (state=3): >>> <<< 15896 1727203857.02228: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.02357: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.02385: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available<<< 15896 1727203857.02487: stdout chunk (state=3): >>> <<< 15896 1727203857.02531: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15896 1727203857.02655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.02702: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.02763: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15896 1727203857.02794: stdout chunk (state=3): >>> <<< 15896 1727203857.02804: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.03089: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.03189: stdout chunk (state=3): >>> <<< 15896 1727203857.03389: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.03465: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.03567: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 15896 1727203857.03690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203857.03704: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 15896 1727203857.03726: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc'<<< 15896 1727203857.03746: stdout chunk (state=3): >>> <<< 15896 1727203857.03781: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b6b70><<< 15896 1727203857.03797: stdout chunk (state=3): >>> <<< 15896 1727203857.03908: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15896 1727203857.03936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc'<<< 15896 1727203857.04209: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317160500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317160860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175a0a70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b7680> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b5250> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b4e00> <<< 15896 1727203857.04303: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15896 1727203857.04334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15896 1727203857.04362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches 
/usr/lib64/python3.12/queue.py<<< 15896 1727203857.04374: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15896 1727203857.04457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203857.04486: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.04559: stdout chunk (state=3): >>>import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171637d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317163080> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 15896 1727203857.04683: stdout chunk (state=3): >>> # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.04686: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317163260> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171624b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15896 1727203857.04873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc'<<< 15896 1727203857.04904: stdout chunk (state=3): >>> import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317163950><<< 15896 1727203857.05016: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15896 1727203857.05144: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171c2450> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c0470> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b4f50> <<< 15896 1727203857.05239: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 15896 1727203857.05264: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15896 1727203857.05288: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.05478: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 15896 1727203857.05566: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.05612: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203857.05666: stdout chunk (state=3): >>> <<< 15896 1727203857.05703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15896 1727203857.05768: stdout chunk (state=3): >>> <<< 15896 1727203857.06023: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15896 1727203857.06027: stdout 
chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15896 1727203857.06039: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.06116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 15896 1727203857.06145: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15896 1727203857.06182: stdout chunk (state=3): >>> <<< 15896 1727203857.06227: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.06301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 15896 1727203857.06319: stdout chunk (state=3): >>> # zipimport: zlib available<<< 15896 1727203857.06412: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203857.06611: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.06615: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.06703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15896 1727203857.06803: stdout chunk (state=3): >>> <<< 15896 1727203857.06816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15896 1727203857.07588: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 15896 1727203857.08382: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08432: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08504: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08550: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08600: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 15896 1727203857.08646: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.08680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15896 1727203857.08750: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08848: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 15896 1727203857.08894: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.08922: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 15896 1727203857.08955: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.09018: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15896 1727203857.09138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.09270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15896 1727203857.09449: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c2750> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15896 1727203857.09466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15896 1727203857.09551: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c3320> import 'ansible.module_utils.facts.system.local' # <<< 15896 1727203857.09680: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.09683: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 
1727203857.09742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15896 1727203857.09901: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.10022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15896 1727203857.10043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.10122: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.10236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 15896 1727203857.10285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.10357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15896 1727203857.10424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15896 1727203857.10519: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.10656: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171fe6f0> <<< 15896 1727203857.10991: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171ec9b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 15896 1727203857.11014: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.11105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15896 1727203857.11281: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.11352: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15896 1727203857.11536: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.11766: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15896 1727203857.11778: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.11807: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.11907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 15896 1727203857.11996: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15896 1727203857.12020: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.12117: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203857.12135: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3172161e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171fc3e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 15896 1727203857.12152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.12232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15896 1727203857.12452: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.12694: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # 
zipimport: zlib available <<< 15896 1727203857.12829: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.12983: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.13089: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # <<< 15896 1727203857.13119: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15896 1727203857.13134: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.13189: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.13365: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.13687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15896 1727203857.13780: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.13963: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15896 1727203857.13984: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.14018: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.14059: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.14940: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.15756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15896 1727203857.15982: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.16115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15896 1727203857.16118: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.16265: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.16411: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15896 1727203857.16423: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.16654: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.16899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15896 1727203857.16933: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15896 1727203857.16991: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.17053: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 15896 1727203857.17074: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.17274: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.17330: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.17666: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.18022: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15896 1727203857.18049: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.18218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 15896 1727203857.18536: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.18539: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # 
zipimport: zlib available <<< 15896 1727203857.18554: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.18615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15896 1727203857.18634: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.18904: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 15896 1727203857.19329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15896 1727203857.19392: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19483: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19486: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15896 1727203857.19489: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19490: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15896 1727203857.19538: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19709: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15896 1727203857.19797: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15896 1727203857.19861: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib 
available <<< 15896 1727203857.19917: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.19972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15896 1727203857.20286: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15896 1727203857.20448: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.20641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15896 1727203857.20663: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.20697: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.20740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 15896 1727203857.20764: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.20800: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.20853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15896 1727203857.20945: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.21031: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15896 1727203857.21049: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.21128: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203857.21401: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 15896 
1727203857.21537: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15896 1727203857.21540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 15896 1727203857.21564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15896 1727203857.21606: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317012de0> <<< 15896 1727203857.21624: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3170127b0> <<< 15896 1727203857.21657: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317011280> <<< 15896 1727203857.23209: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon S<<< 15896 1727203857.23306: stdout chunk (state=3): >>>ep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "57", "epoch": "1727203857", "epoch_int": "1727203857", "date": "2024-09-24", "time": "14:50:57", "iso8601_micro": "2024-09-24T18:50:57.228719Z", "iso8601": "2024-09-24T18:50:57Z", "iso8601_basic": "20240924T145057228719", "iso8601_basic_short": "20240924T145057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15896 1727203857.24364: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path <<< 15896 1727203857.24369: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath<<< 15896 1727203857.24424: stdout chunk (state=3): >>> # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing 
_distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 15896 1727203857.24452: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib<<< 15896 1727203857.24608: stdout chunk (state=3): >>> # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # 
destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex<<< 15896 1727203857.24621: stdout chunk (state=3): >>> # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog <<< 15896 1727203857.24624: stdout chunk (state=3): >>># cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal<<< 15896 1727203857.24772: stdout chunk (state=3): >>> # cleanup[2] 
removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six<<< 15896 1727203857.24776: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text<<< 15896 1727203857.24806: stdout chunk (state=3): >>> # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] 
removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue <<< 15896 1727203857.24841: stdout chunk (state=3): >>># cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool 
# cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg<<< 15896 1727203857.24879: stdout chunk (state=3): >>> # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing 
getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd<<< 15896 1727203857.25109: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd<<< 15896 1727203857.25112: stdout chunk (state=3): >>> # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux<<< 15896 1727203857.25125: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15896 1727203857.25854: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 15896 1727203857.25884: stdout chunk (state=3): >>> # destroy importlib.machinery<<< 15896 1727203857.25930: stdout chunk (state=3): >>> # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 15896 1727203857.25953: stdout chunk (state=3): >>># destroy _compression # destroy _lzma<<< 15896 1727203857.26003: stdout chunk (state=3): >>> # destroy _blake2 # destroy binascii<<< 15896 1727203857.26007: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 15896 1727203857.26036: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 15896 1727203857.26130: stdout chunk (state=3): >>> # destroy ntpath <<< 15896 1727203857.26133: stdout chunk (state=3): >>># destroy importlib <<< 15896 1727203857.26156: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib<<< 15896 1727203857.26224: stdout chunk (state=3): >>> # destroy json.decoder # destroy json.encoder # destroy json.scanner<<< 15896 1727203857.26429: stdout chunk (state=3): >>> # destroy _json # destroy grp # destroy encodings # destroy _locale <<< 15896 1727203857.26432: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal<<< 15896 1727203857.26450: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro <<< 15896 1727203857.26508: stdout chunk (state=3): >>># destroy argparse # destroy logging<<< 15896 1727203857.26542: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 15896 1727203857.26573: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection <<< 15896 1727203857.26657: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy signal<<< 15896 1727203857.26688: stdout chunk (state=3): >>> # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq<<< 15896 1727203857.26726: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors<<< 15896 1727203857.26815: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy shlex # destroy fcntl<<< 15896 1727203857.26818: stdout chunk (state=3): >>> # destroy datetime <<< 15896 1727203857.26820: stdout chunk (state=3): >>># destroy subprocess <<< 15896 1727203857.26864: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 15896 1727203857.26971: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 15896 1727203857.26974: stdout chunk (state=3): >>> # destroy getpass # destroy pwd<<< 15896 1727203857.26979: stdout chunk (state=3): >>> # destroy termios # destroy errno <<< 15896 1727203857.27104: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch<<< 15896 1727203857.27137: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # 
cleanup[3] wiping selinux._selinux<<< 15896 1727203857.27234: stdout chunk (state=3): >>> <<< 15896 1727203857.27241: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 15896 1727203857.27450: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 15896 1727203857.27488: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # 
destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 15896 1727203857.27502: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp<<< 15896 1727203857.27556: stdout chunk (state=3): >>> # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 15896 1727203857.27591: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 15896 1727203857.27678: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15896 1727203857.28184: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib<<< 15896 1727203857.28191: stdout chunk (state=3): >>> # destroy _typing <<< 15896 1727203857.28193: stdout chunk (state=3): >>># destroy _tokenize <<< 15896 1727203857.28211: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 15896 1727203857.28391: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # 
destroy _frozen_importlib <<< 15896 1727203857.28447: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases<<< 15896 1727203857.28493: stdout chunk (state=3): >>> # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 15896 1727203857.28521: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy encodings.idna # destroy _codecs<<< 15896 1727203857.28587: stdout chunk (state=3): >>> # destroy io # destroy traceback # destroy warnings<<< 15896 1727203857.28708: stdout chunk (state=3): >>> # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 15896 1727203857.28738: stdout chunk (state=3): >>># destroy _hashlib<<< 15896 1727203857.28773: stdout chunk (state=3): >>> # destroy _operator<<< 15896 1727203857.28810: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools <<< 15896 1727203857.28959: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15896 1727203857.29514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203857.29517: stdout chunk (state=3): >>><<< 15896 1727203857.29520: stderr chunk (state=3): >>><<< 15896 1727203857.30223: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3181684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318137b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31816aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f5d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f5e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f9bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb0110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fd3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fd3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb3bf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb12e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f99130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff78f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317fb2390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317ff4d40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318024950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f983b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318024e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318024cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3180250a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317f96ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318025790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318025460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318026660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318040890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318041fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd318042e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3180434a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3180423c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd318043e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318043590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3180266c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d37d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d60860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d605c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d60890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d611c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317d61bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d60a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d35eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d62f90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d61d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd318026db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d8b2f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317daf6e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e104d0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e12c30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317e105f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317dd94c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177295e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317dae4e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317d63ec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd317dae840> # zipimport: found 103 names in '/tmp/ansible_setup_payload_rznnixv2/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778f2c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177721b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317771340> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778d160> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bebd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177be960> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177be270> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177bed80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31778fce0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bf920> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3177bfb60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177e8050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31762ddf0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd31762fa10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317634410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317635310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317637f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317638140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3176362d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763bf20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763a9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763a750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31763acc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3176367e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317680170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317680290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317681e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317681c10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3176843b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317682540> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317687b90> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317684560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688c50> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317688ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317680590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317514650> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3175158e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31768ade0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd31768b9b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31768a9c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317519af0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751a870> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317515c10> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751af90> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751bb00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3175265a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317522f60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31760ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3177eea20> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317526360> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd31751cb90> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b6b70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317160500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317160860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175a0a70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b7680> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b5250> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b4e00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171637d0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd317163080> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317163260> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171624b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317163950> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171c2450> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c0470> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3175b4f50> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c2750> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd3171c3320> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3171fe6f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171ec9b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd3172161e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3171fc3e0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd317012de0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd3170127b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd317011280> {"ansible_facts": {"ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": 
"managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "57", "epoch": "1727203857", "epoch_int": "1727203857", "date": "2024-09-24", "time": "14:50:57", "iso8601_micro": "2024-09-24T18:50:57.228719Z", "iso8601": "2024-09-24T18:50:57Z", "iso8601_basic": "20240924T145057228719", "iso8601_basic_short": "20240924T145057", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing 
genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # 
cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket 
# cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping 
configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # 
cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data:
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15896 1727203857.33287: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203857.33291: _low_level_execute_command(): starting 15896 1727203857.33294: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203856.5422974-16144-11339932657995/ > /dev/null 2>&1 && sleep 0' 15896 1727203857.33296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203857.33299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203857.33301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203857.33304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.33567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203857.36268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203857.36272: stdout chunk (state=3): >>><<< 15896 1727203857.36278: stderr chunk (state=3): >>><<< 15896 1727203857.36297: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203857.36308: handler run complete 15896 1727203857.36352: variable 'ansible_facts' from source: unknown 15896 1727203857.36413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203857.36794: variable 'ansible_facts' from source: unknown 15896 1727203857.36847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203857.37000: attempt loop complete, returning result 15896 1727203857.37004: _execute() done 15896 1727203857.37006: dumping result to json 15896 1727203857.37017: done dumping result, returning 15896 1727203857.37032: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-fb83-b6ad-0000000001cd] 15896 1727203857.37035: sending task result for task 028d2410-947f-fb83-b6ad-0000000001cd ok: [managed-node1] 15896 1727203857.37406: no more pending results, 
returning what we have 15896 1727203857.37409: results queue empty 15896 1727203857.37409: checking for any_errors_fatal 15896 1727203857.37415: done checking for any_errors_fatal 15896 1727203857.37416: checking for max_fail_percentage 15896 1727203857.37417: done checking for max_fail_percentage 15896 1727203857.37418: checking to see if all hosts have failed and the running result is not ok 15896 1727203857.37418: done checking to see if all hosts have failed 15896 1727203857.37419: getting the remaining hosts for this loop 15896 1727203857.37420: done getting the remaining hosts for this loop 15896 1727203857.37423: getting the next task for host managed-node1 15896 1727203857.37432: done getting next task for host managed-node1 15896 1727203857.37434: ^ task is: TASK: Check if system is ostree 15896 1727203857.37436: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203857.37439: getting variables 15896 1727203857.37441: in VariableManager get_vars() 15896 1727203857.37468: Calling all_inventory to load vars for managed-node1 15896 1727203857.37470: Calling groups_inventory to load vars for managed-node1 15896 1727203857.37473: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203857.37528: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001cd 15896 1727203857.37531: WORKER PROCESS EXITING 15896 1727203857.37542: Calling all_plugins_play to load vars for managed-node1 15896 1727203857.37545: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203857.37549: Calling groups_plugins_play to load vars for managed-node1 15896 1727203857.38206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203857.38601: done with get_vars() 15896 1727203857.38611: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:50:57 -0400 (0:00:00.979) 0:00:02.977 ***** 15896 1727203857.38787: entering _queue_task() for managed-node1/stat 15896 1727203857.39691: worker is 1 (out of 1 available) 15896 1727203857.39706: exiting _queue_task() for managed-node1/stat 15896 1727203857.39719: done queuing things up, now waiting for results queue to drain 15896 1727203857.39720: waiting for pending results... 
15896 1727203857.39951: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 15896 1727203857.40188: in run() - task 028d2410-947f-fb83-b6ad-0000000001cf 15896 1727203857.40233: variable 'ansible_search_path' from source: unknown 15896 1727203857.40242: variable 'ansible_search_path' from source: unknown 15896 1727203857.40318: calling self._execute() 15896 1727203857.40544: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203857.40547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203857.40550: variable 'omit' from source: magic vars 15896 1727203857.41482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203857.41947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203857.42019: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203857.42393: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203857.42580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203857.42586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203857.42632: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203857.42666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203857.43009: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203857.43071: Evaluated conditional (not __network_is_ostree is defined): True 15896 1727203857.43097: variable 'omit' from source: magic vars 15896 1727203857.43228: variable 'omit' from source: magic vars 15896 1727203857.43399: variable 'omit' from source: magic vars 15896 1727203857.43448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203857.43557: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203857.43649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203857.43732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203857.44083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203857.44087: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203857.44089: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203857.44091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203857.44169: Set connection var ansible_shell_type to sh 15896 1727203857.44185: Set connection var ansible_connection to ssh 15896 1727203857.44581: Set connection var ansible_shell_executable to /bin/sh 15896 1727203857.44586: Set connection var ansible_pipelining to False 15896 1727203857.44589: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203857.44591: Set connection var ansible_timeout to 10 15896 1727203857.44593: variable 'ansible_shell_executable' from source: unknown 15896 1727203857.44595: variable 'ansible_connection' from 
source: unknown 15896 1727203857.44598: variable 'ansible_module_compression' from source: unknown 15896 1727203857.44600: variable 'ansible_shell_type' from source: unknown 15896 1727203857.44602: variable 'ansible_shell_executable' from source: unknown 15896 1727203857.44604: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203857.44606: variable 'ansible_pipelining' from source: unknown 15896 1727203857.44608: variable 'ansible_timeout' from source: unknown 15896 1727203857.44613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203857.44845: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203857.44866: variable 'omit' from source: magic vars 15896 1727203857.44877: starting attempt loop 15896 1727203857.44884: running the handler 15896 1727203857.44907: _low_level_execute_command(): starting 15896 1727203857.45184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203857.46384: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203857.46698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203857.46736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.46821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203857.49387: stdout chunk (state=3): >>>/root <<< 15896 1727203857.49729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203857.49892: stderr chunk (state=3): >>><<< 15896 1727203857.49907: stdout chunk (state=3): >>><<< 15896 1727203857.49947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203857.49988: _low_level_execute_command(): starting 15896 1727203857.50006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020 `" && echo ansible-tmp-1727203857.499688-16232-164894795110020="` echo /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020 `" ) && sleep 0' 15896 1727203857.52063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203857.52068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203857.52082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203857.52300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203857.52390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203857.52402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.52534: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15896 1727203857.55401: stdout chunk (state=3): >>>ansible-tmp-1727203857.499688-16232-164894795110020=/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020 <<< 15896 1727203857.55612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203857.55981: stderr chunk (state=3): >>><<< 15896 1727203857.55984: stdout chunk (state=3): >>><<< 15896 1727203857.55987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203857.499688-16232-164894795110020=/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203857.55989: variable 'ansible_module_compression' from source: unknown 15896 1727203857.55991: ANSIBALLZ: Using lock for stat 15896 1727203857.55993: ANSIBALLZ: Acquiring lock 
15896 1727203857.55994: ANSIBALLZ: Lock acquired: 140082272719200 15896 1727203857.55997: ANSIBALLZ: Creating module 15896 1727203857.85566: ANSIBALLZ: Writing module into payload 15896 1727203857.86039: ANSIBALLZ: Writing module 15896 1727203857.86105: ANSIBALLZ: Renaming module 15896 1727203857.86118: ANSIBALLZ: Done creating module 15896 1727203857.86147: variable 'ansible_facts' from source: unknown 15896 1727203857.86336: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py 15896 1727203857.86807: Sending initial data 15896 1727203857.86810: Sent initial data (152 bytes) 15896 1727203857.88594: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203857.88769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.89195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203857.91512: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15896 1727203857.91531: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203857.91600: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203857.91686: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp2fkg_v01 /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py <<< 15896 1727203857.91794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp2fkg_v01" to remote "/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py" <<< 15896 1727203857.93625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203857.93628: stderr chunk (state=3): >>><<< 15896 1727203857.93633: stdout chunk (state=3): >>><<< 15896 1727203857.93635: done transferring module to remote 15896 1727203857.93637: _low_level_execute_command(): starting 15896 
1727203857.93639: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/ /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py && sleep 0' 15896 1727203857.94626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203857.94648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203857.94651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203857.94655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203857.94789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.94839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203857.97620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203857.97684: stderr chunk (state=3): >>><<< 15896 1727203857.97694: stdout chunk (state=3): >>><<< 15896 1727203857.97870: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203857.97873: _low_level_execute_command(): starting 15896 1727203857.97877: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/AnsiballZ_stat.py && sleep 0' 15896 1727203857.99073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203857.99215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203857.99228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203857.99346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203858.02731: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15896 1727203858.02900: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15896 1727203858.03019: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 15896 1727203858.03024: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15896 1727203858.03046: stdout chunk (state=3): >>>import 'time' # <<< 15896 1727203858.03063: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 15896 1727203858.03132: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.03167: stdout chunk (state=3): >>>import '_codecs' # <<< 15896 1727203858.03197: stdout chunk (state=3): >>>import 'codecs' # <<< 15896 1727203858.03328: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15896 1727203858.03492: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb3104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb2dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15896 1727203858.03570: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15896 1727203858.03618: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15896 1727203858.03655: stdout chunk (state=3): >>>import 'os' # <<< 15896 1727203858.03693: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 15896 1727203858.03707: stdout chunk (state=3): >>>Processing global site-packages <<< 15896 1727203858.03764: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15896 1727203858.03797: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15896 1727203858.03827: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb105130> <<< 15896 1727203858.04090: stdout 
chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb106060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15896 1727203858.04409: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15896 1727203858.04421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15896 1727203858.04457: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15896 1727203858.04615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15896 1727203858.04636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15896 1727203858.04790: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb143f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb158110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15896 1727203858.04840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.04870: stdout chunk (state=3): >>>import 'itertools' # <<< 15896 1727203858.04897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 15896 1727203858.04957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb17b950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb17bfe0> <<< 15896 1727203858.04972: stdout chunk (state=3): >>>import '_collections' # <<< 15896 1727203858.05065: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb15bbf0> import '_functools' # <<< 15896 1727203858.05108: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1592e0> <<< 15896 1727203858.05252: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb141130> <<< 15896 1727203858.05391: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15896 1727203858.05406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15896 1727203858.05545: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19f8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19e510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb15a390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19cd40> <<< 15896 1727203858.05611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15896 1727203858.05629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cc950> <<< 15896 1727203858.05641: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1403b0> <<< 15896 1727203858.05715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1cce00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cccb0> <<< 15896 1727203858.05823: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1cd0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb13eed0> <<< 15896 1727203858.05830: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.05930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15896 1727203858.06115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cd790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cd460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ce660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1e8890> <<< 15896 1727203858.06196: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1e9fd0> <<< 15896 1727203858.06272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15896 1727203858.06277: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1eae70> <<< 15896 1727203858.06326: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203858.06370: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1eb4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ea3c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15896 1727203858.06477: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203858.06517: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1ebe60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1eb590> <<< 15896 1727203858.06540: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ce6c0> <<< 15896 1727203858.06582: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15896 1727203858.06691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15896 1727203858.06803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf67d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fe0baf90860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf905c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf90890> <<< 15896 1727203858.06823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15896 1727203858.06836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15896 1727203858.06931: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203858.07119: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf911c0> <<< 15896 1727203858.07329: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203858.07398: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf91bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf90a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf65eb0> <<< 15896 1727203858.07401: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15896 1727203858.07433: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15896 1727203858.07691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf92f90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf91d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cedb0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15896 1727203858.07897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafbb2f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.08083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafdf6e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15896 1727203858.08168: stdout chunk (state=3): >>>import 'ntpath' # <<< 15896 1727203858.08194: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 15896 1727203858.08209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0404d0> <<< 15896 1727203858.08232: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15896 1727203858.08272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15896 1727203858.08353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15896 1727203858.08368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15896 1727203858.08701: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb042c30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0405f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0094c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 15896 1727203858.08716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9295e0> <<< 15896 1727203858.08736: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafde4e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf93ec0> <<< 15896 1727203858.08998: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe0bafde840> <<< 15896 1727203858.09178: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_yfe1rxbz/ansible_stat_payload.zip' <<< 15896 1727203858.09290: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.09426: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.09448: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15896 1727203858.09731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba97b2c0> import '_typing' # <<< 15896 1727203858.09969: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba95e1b0> <<< 15896 1727203858.09984: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba95d340> # zipimport: zlib available <<< 15896 1727203858.10035: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15896 1727203858.10088: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.10139: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15896 1727203858.12757: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 
1727203858.14728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba979190> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15896 1727203858.14732: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a6b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a68d0> <<< 15896 1727203858.14789: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a61e0> <<< 15896 1727203858.15045: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a6cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba97bf50> import 'atexit' # # extension 
module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a78c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a7b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 15896 1727203858.15153: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a7f80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15896 1727203858.15191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15896 1727203858.15482: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba811dc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8139e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8183e0> # 
/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15896 1727203858.15494: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8192e0> <<< 15896 1727203858.15525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15896 1727203858.15573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15896 1727203858.15605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 15896 1727203858.15616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15896 1727203858.16002: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba820170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81a300> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code 
object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba823f50> <<< 15896 1727203858.16005: stdout chunk (state=3): >>>import '_tokenize' # <<< 15896 1727203858.16108: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822a20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822780> <<< 15896 1727203858.16140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 15896 1727203858.16143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15896 1727203858.16254: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822cf0> <<< 15896 1727203858.16539: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81a810> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba867f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba869d90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba869b50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15896 1727203858.16685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15896 1727203858.16788: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba86c2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86a480> <<< 15896 1727203858.16801: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15896 1727203858.16869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.16898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15896 1727203858.16916: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15896 1727203858.16997: stdout chunk (state=3): >>>import '_string' # <<< 15896 1727203858.17097: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86fa70> <<< 15896 1727203858.17310: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86c440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870b60> <<< 15896 1727203858.17501: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba868470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc 
matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15896 1727203858.17599: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15896 1727203858.17787: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8fc410> <<< 15896 1727203858.17864: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8fd5b0> <<< 15896 1727203858.17907: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba872ba0> <<< 15896 1727203858.17923: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba873a70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8727e0> <<< 15896 1727203858.17993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15896 1727203858.18014: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.18185: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.18315: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15896 1727203858.18321: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15896 1727203858.18414: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.18431: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15896 1727203858.18598: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.18787: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.20021: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.20942: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.21066: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba701670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba702390> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a6ab0> <<< 15896 1727203858.21177: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 15896 
1727203858.21287: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15896 1727203858.21447: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.21829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba702480> # zipimport: zlib available <<< 15896 1727203858.22534: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.23399: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.23625: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 15896 1727203858.23708: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.23722: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.23845: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15896 1727203858.23913: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15896 1727203858.23985: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.24017: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15896 1727203858.24056: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.24480: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.25035: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15896 
1727203858.25118: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba703560> # zipimport: zlib available <<< 15896 1727203858.25219: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.25277: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 15896 1727203858.25281: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 15896 1727203858.25292: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 15896 1727203858.25325: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.25382: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.25472: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 15896 1727203858.25516: stdout chunk (state=3): >>># zipimport: zlib available<<< 15896 1727203858.25519: stdout chunk (state=3): >>> <<< 15896 1727203858.25644: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.25790: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15896 1727203858.25811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15896 1727203858.25947: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba70e120> <<< 15896 1727203858.26005: stdout chunk (state=3): 
>>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba708ef0> <<< 15896 1727203858.26090: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15896 1727203858.26157: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.26248: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.26517: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15896 1727203858.26693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9fe9c0> <<< 15896 1727203858.26737: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9ee690> <<< 15896 1727203858.26853: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba70df70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba872b10> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15896 1727203858.26856: stdout 
chunk (state=3): >>> # zipimport: zlib available <<< 15896 1727203858.26881: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.26962: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15896 1727203858.27003: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15896 1727203858.27023: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.27067: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 15896 1727203858.27170: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.27305: stdout chunk (state=3): >>># zipimport: zlib available <<< 15896 1727203858.27909: stdout chunk (state=3): >>># zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 15896 1727203858.28427: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__<<< 15896 1727203858.28431: stdout chunk (state=3): >>> # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 15896 1727203858.28434: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing 
_codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc <<< 15896 1727203858.28444: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath <<< 15896 1727203858.28841: stdout chunk (state=3): >>># cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # 
cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # 
cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing syste<<< 15896 1727203858.28849: stdout chunk (state=3): >>>md.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15896 1727203858.29171: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15896 1727203858.29180: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15896 1727203858.29232: stdout chunk (state=3): >>># destroy _bz2 <<< 15896 1727203858.29249: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 15896 1727203858.29394: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 15896 1727203858.29573: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal <<< 15896 1727203858.29590: stdout chunk (state=3): >>># cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 15896 1727203858.29892: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping 
threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15896 1727203858.30124: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 15896 1727203858.30192: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # 
destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15896 1727203858.30222: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15896 1727203858.30363: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io <<< 15896 1727203858.30411: stdout chunk (state=3): >>># destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 15896 1727203858.30544: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 15896 1727203858.30762: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15896 1727203858.30980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203858.31130: stderr chunk (state=3): >>><<< 15896 1727203858.31139: stdout chunk (state=3): >>><<< 15896 1727203858.31332: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb3104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb2dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb105130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb106060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb143f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb158110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb17b950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb17bfe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb15bbf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1592e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb141130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19f8f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19e510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb15a390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb19cd40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cc950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1403b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1cce00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cccb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1cd0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb13eed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cd790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cd460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ce660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1e8890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1e9fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1eae70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1eb4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ea3c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0bb1ebe60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1eb590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1ce6c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf67d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf90860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf905c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf90890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf911c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0baf91bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf90a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf65eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf92f90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf91d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb1cedb0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafbb2f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafdf6e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0404d0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb042c30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0405f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bb0094c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9295e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0bafde4e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0baf93ec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe0bafde840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_yfe1rxbz/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba97b2c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba95e1b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba95d340> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba979190> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a6b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a68d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a61e0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a6cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba97bf50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a78c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba9a7b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a7f80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba811dc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8139e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8183e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8192e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba820170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81a300> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba823f50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822a20> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822780> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba822cf0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba81a810> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba867f80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba869d90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba869b50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba86c2c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86fa70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba86c440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870b60> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba870bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba868470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8fc410> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba8fd5b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba872ba0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba873a70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba8727e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba701670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba702390> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9a6ab0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba702480> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba703560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe0ba70e120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba708ef0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9fe9c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba9ee690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba70df70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe0ba872b10> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
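The module run above wrote its JSON result and then CPython's interpreter shutdown chatter (`# destroy ...`, `# cleanup[...] ...`) to the same stdout stream, which is what triggers the `[WARNING]: Module invocation had junk after the JSON data` that follows. A minimal, hypothetical sketch of recovering the payload from such a stream (illustrative only, not Ansible's internal parser):

```python
import json

def split_json_and_junk(stdout: str):
    """Split module stdout into (parsed JSON result, trailing junk).

    Hypothetical helper: assumes the payload is a single JSON object
    beginning at the first '{' in the stream, as in the log above.
    """
    start = stdout.index("{")
    decoder = json.JSONDecoder()
    # raw_decode parses one JSON value and reports where it ended,
    # so everything after that offset is the "junk" Ansible warns about.
    payload, end = decoder.raw_decode(stdout, start)
    return payload, stdout[end:]

# Shape of the stream seen in the log: result dict, then shutdown chatter.
result, junk = split_json_and_junk(
    '{"changed": false, "stat": {"exists": false}} # destroy __main__ ...'
)
```

Ansible tolerates this pattern (hence a warning rather than an error), since a clean module is expected to emit exactly one JSON document on stdout.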
[WARNING]: Module invocation had junk after the JSON data: (verbatim repeat of the interpreter shutdown trace shown above, omitted) 15896 1727203858.32625: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/',
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203858.32628: _low_level_execute_command(): starting 15896 1727203858.32630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203857.499688-16232-164894795110020/ > /dev/null 2>&1 && sleep 0' 15896 1727203858.33218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203858.33233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203858.33460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203858.33496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203858.33536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203858.33620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203858.36435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203858.36449: stdout chunk (state=3): >>><<< 15896 1727203858.36461: 
stderr chunk (state=3): >>><<< 15896 1727203858.36699: _low_level_execute_command() done: rc=0, stdout=, stderr=(same SSH debug output as assembled from the chunks above) 15896 1727203858.36702: handler run complete 15896 1727203858.36704: attempt loop complete, returning result 15896 1727203858.36706: _execute() done 15896 1727203858.36708: dumping result to json 15896 1727203858.36711: done dumping result, returning 15896 1727203858.36712: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [028d2410-947f-fb83-b6ad-0000000001cf] 15896 1727203858.36715: sending task result for task 028d2410-947f-fb83-b6ad-0000000001cf 15896 1727203858.36779: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001cf 15896 1727203858.36782: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 15896 1727203858.36846:
no more pending results, returning what we have 15896 1727203858.36848: results queue empty 15896 1727203858.36849: checking for any_errors_fatal 15896 1727203858.36856: done checking for any_errors_fatal 15896 1727203858.36857: checking for max_fail_percentage 15896 1727203858.36858: done checking for max_fail_percentage 15896 1727203858.36859: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.36859: done checking to see if all hosts have failed 15896 1727203858.36860: getting the remaining hosts for this loop 15896 1727203858.36861: done getting the remaining hosts for this loop 15896 1727203858.36865: getting the next task for host managed-node1 15896 1727203858.36870: done getting next task for host managed-node1 15896 1727203858.36872: ^ task is: TASK: Set flag to indicate system is ostree 15896 1727203858.36977: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.36982: getting variables 15896 1727203858.36984: in VariableManager get_vars() 15896 1727203858.37013: Calling all_inventory to load vars for managed-node1 15896 1727203858.37015: Calling groups_inventory to load vars for managed-node1 15896 1727203858.37137: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.37147: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.37150: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.37153: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.37603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.38007: done with get_vars() 15896 1727203858.38019: done getting variables 15896 1727203858.38437: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.996) 0:00:03.973 ***** 15896 1727203858.38465: entering _queue_task() for managed-node1/set_fact 15896 1727203858.38467: Creating lock for set_fact 15896 1727203858.39291: worker is 1 (out of 1 available) 15896 1727203858.39309: exiting _queue_task() for managed-node1/set_fact 15896 1727203858.39319: done queuing things up, now waiting for results queue to drain 15896 1727203858.39321: waiting for pending results... 
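The "Check if system is ostree" result above (`"stat": {"exists": false}`) and the "Set flag to indicate system is ostree" task being queued follow a common stat-then-set_fact pattern. A hedged YAML sketch of the two tasks, reconstructed only from the names visible in this trace (`__ostree_booted_stat`, `__network_is_ostree`, and the conditional `not __network_is_ostree is defined`); the stat path is an assumption (ostree hosts conventionally expose `/run/ostree-booted`), and the real `el_repo_setup.yml` may differ:

```yaml
# Hypothetical reconstruction -- not the actual el_repo_setup.yml contents.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed path; not shown in this log
  register: __ostree_booted_stat    # variable name taken from the trace

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined   # conditional shown in the trace
```

Registering the stat once and deriving a boolean fact lets later tasks gate on `__network_is_ostree` without re-running the check on every host.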
15896 1727203858.39994: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 15896 1727203858.39998: in run() - task 028d2410-947f-fb83-b6ad-0000000001d0 15896 1727203858.40036: variable 'ansible_search_path' from source: unknown 15896 1727203858.40040: variable 'ansible_search_path' from source: unknown 15896 1727203858.40108: calling self._execute() 15896 1727203858.40357: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.40365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.40387: variable 'omit' from source: magic vars 15896 1727203858.41369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203858.42124: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203858.42181: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203858.42233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203858.42277: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203858.42382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203858.42430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203858.42467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203858.42499: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203858.42634: Evaluated conditional (not __network_is_ostree is defined): True 15896 1727203858.42663: variable 'omit' from source: magic vars 15896 1727203858.42706: variable 'omit' from source: magic vars 15896 1727203858.42836: variable '__ostree_booted_stat' from source: set_fact 15896 1727203858.42978: variable 'omit' from source: magic vars 15896 1727203858.42981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203858.42984: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203858.42992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203858.43013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.43027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.43061: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203858.43070: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.43093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.43209: Set connection var ansible_shell_type to sh 15896 1727203858.43221: Set connection var ansible_connection to ssh 15896 1727203858.43281: Set connection var ansible_shell_executable to /bin/sh 15896 1727203858.43283: Set connection var ansible_pipelining to False 15896 1727203858.43286: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203858.43288: Set connection var ansible_timeout to 10 15896 1727203858.43290: variable 'ansible_shell_executable' 
from source: unknown 15896 1727203858.43292: variable 'ansible_connection' from source: unknown 15896 1727203858.43303: variable 'ansible_module_compression' from source: unknown 15896 1727203858.43305: variable 'ansible_shell_type' from source: unknown 15896 1727203858.43306: variable 'ansible_shell_executable' from source: unknown 15896 1727203858.43310: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.43322: variable 'ansible_pipelining' from source: unknown 15896 1727203858.43328: variable 'ansible_timeout' from source: unknown 15896 1727203858.43335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.43449: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203858.43467: variable 'omit' from source: magic vars 15896 1727203858.43522: starting attempt loop 15896 1727203858.43525: running the handler 15896 1727203858.43527: handler run complete 15896 1727203858.43531: attempt loop complete, returning result 15896 1727203858.43533: _execute() done 15896 1727203858.43535: dumping result to json 15896 1727203858.43537: done dumping result, returning 15896 1727203858.43539: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-0000000001d0] 15896 1727203858.43630: sending task result for task 028d2410-947f-fb83-b6ad-0000000001d0 15896 1727203858.43711: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001d0 15896 1727203858.43714: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15896 1727203858.43813: no more pending results, returning what we have 15896 1727203858.43816: results 
queue empty 15896 1727203858.43817: checking for any_errors_fatal 15896 1727203858.43825: done checking for any_errors_fatal 15896 1727203858.43826: checking for max_fail_percentage 15896 1727203858.43828: done checking for max_fail_percentage 15896 1727203858.43829: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.43830: done checking to see if all hosts have failed 15896 1727203858.43831: getting the remaining hosts for this loop 15896 1727203858.43832: done getting the remaining hosts for this loop 15896 1727203858.43836: getting the next task for host managed-node1 15896 1727203858.43964: done getting next task for host managed-node1 15896 1727203858.43967: ^ task is: TASK: Fix CentOS6 Base repo 15896 1727203858.43970: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.43974: getting variables 15896 1727203858.43977: in VariableManager get_vars() 15896 1727203858.44009: Calling all_inventory to load vars for managed-node1 15896 1727203858.44011: Calling groups_inventory to load vars for managed-node1 15896 1727203858.44015: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.44027: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.44030: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.44039: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.44453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.44663: done with get_vars() 15896 1727203858.44672: done getting variables 15896 1727203858.44800: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.063) 0:00:04.037 ***** 15896 1727203858.44836: entering _queue_task() for managed-node1/copy 15896 1727203858.45120: worker is 1 (out of 1 available) 15896 1727203858.45132: exiting _queue_task() for managed-node1/copy 15896 1727203858.45143: done queuing things up, now waiting for results queue to drain 15896 1727203858.45146: waiting for pending results... 
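The `skipping:` result that follows illustrates how a `when:` list is evaluated: each entry renders in order, and the first one that comes out False (`ansible_distribution_major_version == '6'` here, on this CentOS 10 node) is echoed back as `false_condition`. A minimal hedged sketch of a task guarded this way; the `copy` arguments are not visible in the log, so the ones below are placeholders only:

```yaml
# Sketch of the conditional shape only; dest/content are hypothetical.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed target; not in the log
    content: "..."                            # repo definition elided
  when:
    - ansible_distribution == 'CentOS'                # evaluated True above
    - ansible_distribution_major_version == '6'       # evaluated False -> skip
```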
15896 1727203858.45608: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 15896 1727203858.45612: in run() - task 028d2410-947f-fb83-b6ad-0000000001d2 15896 1727203858.45615: variable 'ansible_search_path' from source: unknown 15896 1727203858.45618: variable 'ansible_search_path' from source: unknown 15896 1727203858.45624: calling self._execute() 15896 1727203858.45717: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.45727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.45740: variable 'omit' from source: magic vars 15896 1727203858.46177: variable 'ansible_distribution' from source: facts 15896 1727203858.46207: Evaluated conditional (ansible_distribution == 'CentOS'): True 15896 1727203858.46354: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.46369: Evaluated conditional (ansible_distribution_major_version == '6'): False 15896 1727203858.46428: when evaluation is False, skipping this task 15896 1727203858.46432: _execute() done 15896 1727203858.46434: dumping result to json 15896 1727203858.46436: done dumping result, returning 15896 1727203858.46439: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [028d2410-947f-fb83-b6ad-0000000001d2] 15896 1727203858.46443: sending task result for task 028d2410-947f-fb83-b6ad-0000000001d2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15896 1727203858.46691: no more pending results, returning what we have 15896 1727203858.46694: results queue empty 15896 1727203858.46695: checking for any_errors_fatal 15896 1727203858.46700: done checking for any_errors_fatal 15896 1727203858.46700: checking for max_fail_percentage 15896 1727203858.46702: done checking for max_fail_percentage 15896 1727203858.46703: checking to see if all hosts have failed and the 
running result is not ok 15896 1727203858.46704: done checking to see if all hosts have failed 15896 1727203858.46704: getting the remaining hosts for this loop 15896 1727203858.46706: done getting the remaining hosts for this loop 15896 1727203858.46709: getting the next task for host managed-node1 15896 1727203858.46716: done getting next task for host managed-node1 15896 1727203858.46718: ^ task is: TASK: Include the task 'enable_epel.yml' 15896 1727203858.46721: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.46725: getting variables 15896 1727203858.46726: in VariableManager get_vars() 15896 1727203858.46755: Calling all_inventory to load vars for managed-node1 15896 1727203858.46758: Calling groups_inventory to load vars for managed-node1 15896 1727203858.46765: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.46891: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.46895: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.46900: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001d2 15896 1727203858.46902: WORKER PROCESS EXITING 15896 1727203858.46906: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.47163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.47585: done with get_vars() 15896 1727203858.47595: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.029) 0:00:04.067 ***** 15896 1727203858.47805: entering _queue_task() for managed-node1/include_tasks 15896 1727203858.48644: worker is 1 (out of 1 available) 15896 1727203858.48656: exiting _queue_task() for managed-node1/include_tasks 15896 1727203858.48671: done queuing things up, now waiting for results queue to drain 15896 1727203858.48673: waiting for pending results... 
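The include task queued above is gated on the fact set earlier; the trace shows the conditional `not __network_is_ostree | d(false)` evaluating True before the file is loaded. A hedged sketch of the include at `el_repo_setup.yml:51`; the relative path is an assumption based on the resolved file the log reports (`.../tests/network/tasks/enable_epel.yml`):

```yaml
# Hedged reconstruction; only the task name and conditional appear verbatim in the log.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml   # resolved relative to the including file
  when: not __network_is_ostree | d(false)         # conditional shown in the trace
```

Because `include_tasks` is dynamic, the log then shows the included blocks being parsed, tag-filtered, and spliced into the host's task list ("extending task lists for all hosts with included blocks").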
15896 1727203858.49134: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 15896 1727203858.49331: in run() - task 028d2410-947f-fb83-b6ad-0000000001d3 15896 1727203858.49507: variable 'ansible_search_path' from source: unknown 15896 1727203858.49512: variable 'ansible_search_path' from source: unknown 15896 1727203858.49563: calling self._execute() 15896 1727203858.49718: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.49729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.49743: variable 'omit' from source: magic vars 15896 1727203858.50820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203858.53584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203858.53679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203858.53732: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203858.53773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203858.53817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203858.53927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203858.53957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203858.53992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203858.54054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203858.54144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203858.54207: variable '__network_is_ostree' from source: set_fact 15896 1727203858.54229: Evaluated conditional (not __network_is_ostree | d(false)): True 15896 1727203858.54239: _execute() done 15896 1727203858.54247: dumping result to json 15896 1727203858.54258: done dumping result, returning 15896 1727203858.54283: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [028d2410-947f-fb83-b6ad-0000000001d3] 15896 1727203858.54292: sending task result for task 028d2410-947f-fb83-b6ad-0000000001d3 15896 1727203858.54544: no more pending results, returning what we have 15896 1727203858.54549: in VariableManager get_vars() 15896 1727203858.54589: Calling all_inventory to load vars for managed-node1 15896 1727203858.54592: Calling groups_inventory to load vars for managed-node1 15896 1727203858.54596: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.54607: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.54610: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.54612: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.54915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.55258: done with get_vars() 15896 1727203858.55270: variable 'ansible_search_path' from source: unknown 15896 
1727203858.55271: variable 'ansible_search_path' from source: unknown 15896 1727203858.55300: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001d3 15896 1727203858.55303: WORKER PROCESS EXITING 15896 1727203858.55331: we have included files to process 15896 1727203858.55332: generating all_blocks data 15896 1727203858.55333: done generating all_blocks data 15896 1727203858.55338: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15896 1727203858.55339: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15896 1727203858.55342: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15896 1727203858.56114: done processing included file 15896 1727203858.56117: iterating over new_blocks loaded from include file 15896 1727203858.56118: in VariableManager get_vars() 15896 1727203858.56131: done with get_vars() 15896 1727203858.56132: filtering new block on tags 15896 1727203858.56169: done filtering new block on tags 15896 1727203858.56172: in VariableManager get_vars() 15896 1727203858.56184: done with get_vars() 15896 1727203858.56186: filtering new block on tags 15896 1727203858.56198: done filtering new block on tags 15896 1727203858.56200: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 15896 1727203858.56205: extending task lists for all hosts with included blocks 15896 1727203858.56325: done extending task lists 15896 1727203858.56327: done processing included files 15896 1727203858.56327: results queue empty 15896 1727203858.56328: checking for any_errors_fatal 15896 1727203858.56332: done checking for any_errors_fatal 15896 1727203858.56332: checking for max_fail_percentage 15896 
1727203858.56333: done checking for max_fail_percentage 15896 1727203858.56334: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.56335: done checking to see if all hosts have failed 15896 1727203858.56335: getting the remaining hosts for this loop 15896 1727203858.56337: done getting the remaining hosts for this loop 15896 1727203858.56339: getting the next task for host managed-node1 15896 1727203858.56343: done getting next task for host managed-node1 15896 1727203858.56345: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15896 1727203858.56347: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.56349: getting variables 15896 1727203858.56350: in VariableManager get_vars() 15896 1727203858.56358: Calling all_inventory to load vars for managed-node1 15896 1727203858.56362: Calling groups_inventory to load vars for managed-node1 15896 1727203858.56366: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.56381: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.56389: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.56393: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.56554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.56775: done with get_vars() 15896 1727203858.56784: done getting variables 15896 1727203858.56862: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15896 1727203858.56997: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.092) 0:00:04.159 ***** 15896 1727203858.57056: entering _queue_task() for managed-node1/command 15896 1727203858.57057: Creating lock for command 15896 1727203858.57730: worker is 1 (out of 1 available) 15896 1727203858.57741: exiting _queue_task() for managed-node1/command 15896 1727203858.57754: done queuing things up, now waiting for results queue to drain 15896 1727203858.57756: waiting for pending results... 
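The banner `TASK [Create EPEL 10]` shows a templated task name: `{{ ansible_distribution_major_version }}` is rendered from facts just before display (note the "variable 'ansible_distribution_major_version' from source: facts" entry immediately preceding the banner). A hedged sketch matching the two conditionals evaluated in the next entries; the actual command line is not visible in this trace:

```yaml
# Sketch only -- the real command at enable_epel.yml:8 is not shown in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}   # rendered as "Create EPEL 10"
  ansible.builtin.command: "true"                              # placeholder command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']          # True on this node
    - ansible_distribution_major_version in ['7', '8']      # False on 10 -> skip
```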
15896 1727203858.58437: running TaskExecutor() for managed-node1/TASK: Create EPEL 10 15896 1727203858.58607: in run() - task 028d2410-947f-fb83-b6ad-0000000001ed 15896 1727203858.58627: variable 'ansible_search_path' from source: unknown 15896 1727203858.58639: variable 'ansible_search_path' from source: unknown 15896 1727203858.58694: calling self._execute() 15896 1727203858.58784: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.58800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.58814: variable 'omit' from source: magic vars 15896 1727203858.59230: variable 'ansible_distribution' from source: facts 15896 1727203858.59243: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15896 1727203858.59687: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.59691: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15896 1727203858.59694: when evaluation is False, skipping this task 15896 1727203858.59696: _execute() done 15896 1727203858.59698: dumping result to json 15896 1727203858.59700: done dumping result, returning 15896 1727203858.59702: done running TaskExecutor() for managed-node1/TASK: Create EPEL 10 [028d2410-947f-fb83-b6ad-0000000001ed] 15896 1727203858.59705: sending task result for task 028d2410-947f-fb83-b6ad-0000000001ed 15896 1727203858.59857: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001ed 15896 1727203858.59862: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15896 1727203858.59941: no more pending results, returning what we have 15896 1727203858.59944: results queue empty 15896 1727203858.59945: checking for any_errors_fatal 15896 1727203858.59946: done checking for any_errors_fatal 15896 1727203858.59947: checking for 
max_fail_percentage 15896 1727203858.59949: done checking for max_fail_percentage 15896 1727203858.59950: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.59950: done checking to see if all hosts have failed 15896 1727203858.59951: getting the remaining hosts for this loop 15896 1727203858.59953: done getting the remaining hosts for this loop 15896 1727203858.59956: getting the next task for host managed-node1 15896 1727203858.59966: done getting next task for host managed-node1 15896 1727203858.59968: ^ task is: TASK: Install yum-utils package 15896 1727203858.59979: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.59985: getting variables 15896 1727203858.59986: in VariableManager get_vars() 15896 1727203858.60016: Calling all_inventory to load vars for managed-node1 15896 1727203858.60018: Calling groups_inventory to load vars for managed-node1 15896 1727203858.60022: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.60036: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.60039: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.60042: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.60601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.61220: done with get_vars() 15896 1727203858.61231: done getting variables 15896 1727203858.61481: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.044) 0:00:04.204 ***** 15896 1727203858.61521: entering _queue_task() for managed-node1/package 15896 1727203858.61523: Creating lock for package 15896 1727203858.62218: worker is 1 (out of 1 available) 15896 1727203858.62231: exiting _queue_task() for managed-node1/package 15896 1727203858.62242: done queuing things up, now waiting for results queue to drain 15896 1727203858.62244: waiting for pending results... 
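For the next task the log loads the generic `package` action plugin (which dispatches to the platform's package manager) and then skips it under the same distro/version guard. A hedged sketch; only the task name and the `package` action appear in the log, so the module arguments below are assumptions:

```yaml
# Hypothetical reconstruction of the task at enable_epel.yml:26.
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils       # inferred from the task name
    state: present        # assumed; not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']   # False here -> skipped
```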
15896 1727203858.62773: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 15896 1727203858.62779: in run() - task 028d2410-947f-fb83-b6ad-0000000001ee 15896 1727203858.62782: variable 'ansible_search_path' from source: unknown 15896 1727203858.62784: variable 'ansible_search_path' from source: unknown 15896 1727203858.63073: calling self._execute() 15896 1727203858.63170: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.63183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.63196: variable 'omit' from source: magic vars 15896 1727203858.63683: variable 'ansible_distribution' from source: facts 15896 1727203858.63771: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15896 1727203858.63837: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.63850: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15896 1727203858.63863: when evaluation is False, skipping this task 15896 1727203858.63872: _execute() done 15896 1727203858.63897: dumping result to json 15896 1727203858.63907: done dumping result, returning 15896 1727203858.63919: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [028d2410-947f-fb83-b6ad-0000000001ee] 15896 1727203858.63996: sending task result for task 028d2410-947f-fb83-b6ad-0000000001ee 15896 1727203858.64086: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001ee 15896 1727203858.64090: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15896 1727203858.64157: no more pending results, returning what we have 15896 1727203858.64162: results queue empty 15896 1727203858.64163: checking for any_errors_fatal 15896 1727203858.64169: done checking for any_errors_fatal 15896 
1727203858.64170: checking for max_fail_percentage 15896 1727203858.64172: done checking for max_fail_percentage 15896 1727203858.64172: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.64173: done checking to see if all hosts have failed 15896 1727203858.64174: getting the remaining hosts for this loop 15896 1727203858.64177: done getting the remaining hosts for this loop 15896 1727203858.64183: getting the next task for host managed-node1 15896 1727203858.64193: done getting next task for host managed-node1 15896 1727203858.64196: ^ task is: TASK: Enable EPEL 7 15896 1727203858.64202: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.64208: getting variables 15896 1727203858.64210: in VariableManager get_vars() 15896 1727203858.64481: Calling all_inventory to load vars for managed-node1 15896 1727203858.64487: Calling groups_inventory to load vars for managed-node1 15896 1727203858.64492: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.64506: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.64513: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.64517: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.65000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.65228: done with get_vars() 15896 1727203858.65238: done getting variables 15896 1727203858.65313: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.038) 0:00:04.243 ***** 15896 1727203858.65394: entering _queue_task() for managed-node1/command 15896 1727203858.65897: worker is 1 (out of 1 available) 15896 1727203858.65915: exiting _queue_task() for managed-node1/command 15896 1727203858.65926: done queuing things up, now waiting for results queue to drain 15896 1727203858.65927: waiting for pending results... 
15896 1727203858.66251: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 15896 1727203858.66410: in run() - task 028d2410-947f-fb83-b6ad-0000000001ef 15896 1727203858.66417: variable 'ansible_search_path' from source: unknown 15896 1727203858.66423: variable 'ansible_search_path' from source: unknown 15896 1727203858.66425: calling self._execute() 15896 1727203858.66560: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.66572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.66620: variable 'omit' from source: magic vars 15896 1727203858.67187: variable 'ansible_distribution' from source: facts 15896 1727203858.67191: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15896 1727203858.67336: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.67352: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15896 1727203858.67361: when evaluation is False, skipping this task 15896 1727203858.67368: _execute() done 15896 1727203858.67377: dumping result to json 15896 1727203858.67385: done dumping result, returning 15896 1727203858.67394: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [028d2410-947f-fb83-b6ad-0000000001ef] 15896 1727203858.67420: sending task result for task 028d2410-947f-fb83-b6ad-0000000001ef skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15896 1727203858.67699: no more pending results, returning what we have 15896 1727203858.67703: results queue empty 15896 1727203858.67703: checking for any_errors_fatal 15896 1727203858.67710: done checking for any_errors_fatal 15896 1727203858.67711: checking for max_fail_percentage 15896 1727203858.67712: done checking for max_fail_percentage 15896 1727203858.67715: checking to see if all hosts have failed and 
the running result is not ok 15896 1727203858.67716: done checking to see if all hosts have failed 15896 1727203858.67717: getting the remaining hosts for this loop 15896 1727203858.67719: done getting the remaining hosts for this loop 15896 1727203858.67722: getting the next task for host managed-node1 15896 1727203858.67729: done getting next task for host managed-node1 15896 1727203858.67734: ^ task is: TASK: Enable EPEL 8 15896 1727203858.67739: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.67744: getting variables 15896 1727203858.67746: in VariableManager get_vars() 15896 1727203858.67780: Calling all_inventory to load vars for managed-node1 15896 1727203858.67783: Calling groups_inventory to load vars for managed-node1 15896 1727203858.67787: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.67802: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.67806: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.67809: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.68230: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001ef 15896 1727203858.68233: WORKER PROCESS EXITING 15896 1727203858.68257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.68541: done with get_vars() 15896 1727203858.68552: done getting variables 15896 1727203858.68611: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.032) 0:00:04.275 ***** 15896 1727203858.68656: entering _queue_task() for managed-node1/command 15896 1727203858.69012: worker is 1 (out of 1 available) 15896 1727203858.69023: exiting _queue_task() for managed-node1/command 15896 1727203858.69036: done queuing things up, now waiting for results queue to drain 15896 1727203858.69038: waiting for pending results... 
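The `skipping:` results above all follow the same shape: the action plugin loads, `_execute()` evaluates the task's `when:` conditions against gathered facts, and the task is skipped as soon as one condition evaluates False. A minimal sketch of a task shaped like the logged `Install yum-utils package` task (reconstructed from the conditional strings in the log, not the actual contents of enable_epel.yml):

```yaml
# Hypothetical reconstruction; only the task name and the two evaluated
# conditionals are confirmed by the log. In this run the first conditional
# evaluated True and the second False (major version outside 7/8), so the
# task was skipped with "Conditional result was False".
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```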
15896 1727203858.69285: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 15896 1727203858.69459: in run() - task 028d2410-947f-fb83-b6ad-0000000001f0 15896 1727203858.69525: variable 'ansible_search_path' from source: unknown 15896 1727203858.69604: variable 'ansible_search_path' from source: unknown 15896 1727203858.69682: calling self._execute() 15896 1727203858.69726: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.69730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.69748: variable 'omit' from source: magic vars 15896 1727203858.70159: variable 'ansible_distribution' from source: facts 15896 1727203858.70180: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15896 1727203858.70316: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.70321: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15896 1727203858.70324: when evaluation is False, skipping this task 15896 1727203858.70327: _execute() done 15896 1727203858.70330: dumping result to json 15896 1727203858.70480: done dumping result, returning 15896 1727203858.70483: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [028d2410-947f-fb83-b6ad-0000000001f0] 15896 1727203858.70485: sending task result for task 028d2410-947f-fb83-b6ad-0000000001f0 15896 1727203858.70545: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001f0 15896 1727203858.70547: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15896 1727203858.70613: no more pending results, returning what we have 15896 1727203858.70615: results queue empty 15896 1727203858.70616: checking for any_errors_fatal 15896 1727203858.70623: done checking for any_errors_fatal 15896 1727203858.70623: checking for 
max_fail_percentage 15896 1727203858.70625: done checking for max_fail_percentage 15896 1727203858.70625: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.70626: done checking to see if all hosts have failed 15896 1727203858.70626: getting the remaining hosts for this loop 15896 1727203858.70628: done getting the remaining hosts for this loop 15896 1727203858.70630: getting the next task for host managed-node1 15896 1727203858.70637: done getting next task for host managed-node1 15896 1727203858.70639: ^ task is: TASK: Enable EPEL 6 15896 1727203858.70643: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.70646: getting variables 15896 1727203858.70647: in VariableManager get_vars() 15896 1727203858.70672: Calling all_inventory to load vars for managed-node1 15896 1727203858.70674: Calling groups_inventory to load vars for managed-node1 15896 1727203858.70679: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.70696: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.70699: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.70701: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.70956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.71201: done with get_vars() 15896 1727203858.71210: done getting variables 15896 1727203858.71316: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.026) 0:00:04.302 ***** 15896 1727203858.71354: entering _queue_task() for managed-node1/copy 15896 1727203858.71713: worker is 1 (out of 1 available) 15896 1727203858.71726: exiting _queue_task() for managed-node1/copy 15896 1727203858.71738: done queuing things up, now waiting for results queue to drain 15896 1727203858.71740: waiting for pending results... 
15896 1727203858.72009: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 15896 1727203858.72142: in run() - task 028d2410-947f-fb83-b6ad-0000000001f2 15896 1727203858.72160: variable 'ansible_search_path' from source: unknown 15896 1727203858.72164: variable 'ansible_search_path' from source: unknown 15896 1727203858.72201: calling self._execute() 15896 1727203858.72296: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.72300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.72311: variable 'omit' from source: magic vars 15896 1727203858.72781: variable 'ansible_distribution' from source: facts 15896 1727203858.72887: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15896 1727203858.73162: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.73188: Evaluated conditional (ansible_distribution_major_version == '6'): False 15896 1727203858.73228: when evaluation is False, skipping this task 15896 1727203858.73231: _execute() done 15896 1727203858.73308: dumping result to json 15896 1727203858.73338: done dumping result, returning 15896 1727203858.73347: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [028d2410-947f-fb83-b6ad-0000000001f2] 15896 1727203858.73364: sending task result for task 028d2410-947f-fb83-b6ad-0000000001f2 15896 1727203858.73547: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001f2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15896 1727203858.73726: no more pending results, returning what we have 15896 1727203858.73729: results queue empty 15896 1727203858.73730: checking for any_errors_fatal 15896 1727203858.73739: done checking for any_errors_fatal 15896 1727203858.73740: checking for max_fail_percentage 15896 1727203858.73833: done checking for 
max_fail_percentage 15896 1727203858.73834: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.73835: done checking to see if all hosts have failed 15896 1727203858.73836: getting the remaining hosts for this loop 15896 1727203858.73838: done getting the remaining hosts for this loop 15896 1727203858.73842: getting the next task for host managed-node1 15896 1727203858.74078: done getting next task for host managed-node1 15896 1727203858.74083: ^ task is: TASK: Set network provider to 'nm' 15896 1727203858.74085: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203858.74089: getting variables 15896 1727203858.74091: in VariableManager get_vars() 15896 1727203858.74122: Calling all_inventory to load vars for managed-node1 15896 1727203858.74127: Calling groups_inventory to load vars for managed-node1 15896 1727203858.74130: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.74146: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.74152: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.74156: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.74782: WORKER PROCESS EXITING 15896 1727203858.74897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.75226: done with get_vars() 15896 1727203858.75235: done getting variables 15896 1727203858.75297: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.039) 0:00:04.342 ***** 15896 1727203858.75325: entering _queue_task() for managed-node1/set_fact 15896 1727203858.75652: worker is 1 (out of 1 available) 15896 1727203858.75667: exiting _queue_task() for managed-node1/set_fact 15896 1727203858.75682: done queuing things up, now waiting for results queue to drain 15896 1727203858.75686: waiting for pending results... 15896 1727203858.75999: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 15896 1727203858.76098: in run() - task 028d2410-947f-fb83-b6ad-000000000007 15896 1727203858.76118: variable 'ansible_search_path' from source: unknown 15896 1727203858.76157: calling self._execute() 15896 1727203858.76282: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.76286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.76295: variable 'omit' from source: magic vars 15896 1727203858.76411: variable 'omit' from source: magic vars 15896 1727203858.76447: variable 'omit' from source: magic vars 15896 1727203858.76488: variable 'omit' from source: magic vars 15896 1727203858.76534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203858.76574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203858.76603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203858.76680: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.76684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.76687: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203858.76689: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.76692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.76798: Set connection var ansible_shell_type to sh 15896 1727203858.76812: Set connection var ansible_connection to ssh 15896 1727203858.76822: Set connection var ansible_shell_executable to /bin/sh 15896 1727203858.76837: Set connection var ansible_pipelining to False 15896 1727203858.76847: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203858.76856: Set connection var ansible_timeout to 10 15896 1727203858.76883: variable 'ansible_shell_executable' from source: unknown 15896 1727203858.76939: variable 'ansible_connection' from source: unknown 15896 1727203858.76942: variable 'ansible_module_compression' from source: unknown 15896 1727203858.76945: variable 'ansible_shell_type' from source: unknown 15896 1727203858.76947: variable 'ansible_shell_executable' from source: unknown 15896 1727203858.76950: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.76952: variable 'ansible_pipelining' from source: unknown 15896 1727203858.76954: variable 'ansible_timeout' from source: unknown 15896 1727203858.76956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.77114: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203858.77131: variable 'omit' from source: magic vars 15896 1727203858.77141: starting attempt loop 15896 1727203858.77152: running the handler 15896 1727203858.77385: handler run complete 15896 1727203858.77388: attempt loop complete, returning result 15896 1727203858.77390: _execute() done 15896 1727203858.77392: dumping result to json 15896 1727203858.77394: done dumping result, returning 15896 1727203858.77396: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [028d2410-947f-fb83-b6ad-000000000007] 15896 1727203858.77398: sending task result for task 028d2410-947f-fb83-b6ad-000000000007 15896 1727203858.77458: done sending task result for task 028d2410-947f-fb83-b6ad-000000000007 15896 1727203858.77461: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15896 1727203858.77528: no more pending results, returning what we have 15896 1727203858.77530: results queue empty 15896 1727203858.77531: checking for any_errors_fatal 15896 1727203858.77536: done checking for any_errors_fatal 15896 1727203858.77536: checking for max_fail_percentage 15896 1727203858.77538: done checking for max_fail_percentage 15896 1727203858.77538: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.77539: done checking to see if all hosts have failed 15896 1727203858.77540: getting the remaining hosts for this loop 15896 1727203858.77541: done getting the remaining hosts for this loop 15896 1727203858.77544: getting the next task for host managed-node1 15896 1727203858.77548: done getting next task for host managed-node1 15896 1727203858.77550: ^ task is: TASK: meta (flush_handlers) 15896 1727203858.77552: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203858.77556: getting variables 15896 1727203858.77557: in VariableManager get_vars() 15896 1727203858.77599: Calling all_inventory to load vars for managed-node1 15896 1727203858.77602: Calling groups_inventory to load vars for managed-node1 15896 1727203858.77605: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.77614: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.77617: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.77620: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.77817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.78109: done with get_vars() 15896 1727203858.78132: done getting variables 15896 1727203858.78217: in VariableManager get_vars() 15896 1727203858.78253: Calling all_inventory to load vars for managed-node1 15896 1727203858.78256: Calling groups_inventory to load vars for managed-node1 15896 1727203858.78259: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.78265: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.78269: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.78273: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.78447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.78744: done with get_vars() 15896 1727203858.78758: done queuing things up, now waiting for results queue to drain 15896 1727203858.78762: results queue empty 15896 1727203858.78763: checking for any_errors_fatal 15896 1727203858.78765: done checking for 
any_errors_fatal 15896 1727203858.78766: checking for max_fail_percentage 15896 1727203858.78767: done checking for max_fail_percentage 15896 1727203858.78768: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.78769: done checking to see if all hosts have failed 15896 1727203858.78769: getting the remaining hosts for this loop 15896 1727203858.78770: done getting the remaining hosts for this loop 15896 1727203858.78772: getting the next task for host managed-node1 15896 1727203858.78787: done getting next task for host managed-node1 15896 1727203858.78788: ^ task is: TASK: meta (flush_handlers) 15896 1727203858.78790: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203858.78803: getting variables 15896 1727203858.78805: in VariableManager get_vars() 15896 1727203858.78819: Calling all_inventory to load vars for managed-node1 15896 1727203858.78821: Calling groups_inventory to load vars for managed-node1 15896 1727203858.78823: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.78827: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.78829: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.78864: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.79116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.79321: done with get_vars() 15896 1727203858.79328: done getting variables 15896 1727203858.79368: in VariableManager get_vars() 15896 1727203858.79377: Calling all_inventory to load vars for managed-node1 15896 1727203858.79379: Calling groups_inventory to load vars for managed-node1 15896 
1727203858.79381: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.79384: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.79386: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.79389: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.79534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.79977: done with get_vars() 15896 1727203858.79987: done queuing things up, now waiting for results queue to drain 15896 1727203858.79989: results queue empty 15896 1727203858.79990: checking for any_errors_fatal 15896 1727203858.79991: done checking for any_errors_fatal 15896 1727203858.79992: checking for max_fail_percentage 15896 1727203858.79993: done checking for max_fail_percentage 15896 1727203858.79993: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.79994: done checking to see if all hosts have failed 15896 1727203858.79995: getting the remaining hosts for this loop 15896 1727203858.79996: done getting the remaining hosts for this loop 15896 1727203858.79998: getting the next task for host managed-node1 15896 1727203858.80000: done getting next task for host managed-node1 15896 1727203858.80001: ^ task is: None 15896 1727203858.80003: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.80004: done queuing things up, now waiting for results queue to drain 15896 1727203858.80004: results queue empty 15896 1727203858.80005: checking for any_errors_fatal 15896 1727203858.80006: done checking for any_errors_fatal 15896 1727203858.80006: checking for max_fail_percentage 15896 1727203858.80007: done checking for max_fail_percentage 15896 1727203858.80008: checking to see if all hosts have failed and the running result is not ok 15896 1727203858.80009: done checking to see if all hosts have failed 15896 1727203858.80010: getting the next task for host managed-node1 15896 1727203858.80012: done getting next task for host managed-node1 15896 1727203858.80013: ^ task is: None 15896 1727203858.80014: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.80061: in VariableManager get_vars() 15896 1727203858.80095: done with get_vars() 15896 1727203858.80102: in VariableManager get_vars() 15896 1727203858.80122: done with get_vars() 15896 1727203858.80126: variable 'omit' from source: magic vars 15896 1727203858.80155: in VariableManager get_vars() 15896 1727203858.80175: done with get_vars() 15896 1727203858.80197: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 15896 1727203858.81396: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15896 1727203858.81420: getting the remaining hosts for this loop 15896 1727203858.81421: done getting the remaining hosts for this loop 15896 1727203858.81423: getting the next task for host managed-node1 15896 1727203858.81426: done getting next task for host managed-node1 15896 1727203858.81428: ^ task is: TASK: Gathering Facts 15896 1727203858.81429: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203858.81431: getting variables 15896 1727203858.81432: in VariableManager get_vars() 15896 1727203858.81448: Calling all_inventory to load vars for managed-node1 15896 1727203858.81450: Calling groups_inventory to load vars for managed-node1 15896 1727203858.81453: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203858.81479: Calling all_plugins_play to load vars for managed-node1 15896 1727203858.81494: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203858.81498: Calling groups_plugins_play to load vars for managed-node1 15896 1727203858.81628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203858.81899: done with get_vars() 15896 1727203858.81906: done getting variables 15896 1727203858.81964: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Tuesday 24 September 2024 14:50:58 -0400 (0:00:00.066) 0:00:04.409 ***** 15896 1727203858.81995: entering _queue_task() for managed-node1/gather_facts 15896 1727203858.82532: worker is 1 (out of 1 available) 15896 1727203858.82544: exiting _queue_task() for managed-node1/gather_facts 15896 1727203858.82553: done queuing things up, now waiting for results queue to drain 15896 1727203858.82554: waiting for pending results... 
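The one task in this stretch of the log that actually ran, `Set network provider to 'nm'`, is a plain `set_fact` (its handler completes immediately with `changed: false`). A hedged reconstruction of the task at tests_bond_removal_nm.yml:13, based only on the logged result (the fact name and value are confirmed by the `ok:` output; the surrounding YAML is assumed):

```yaml
# Hypothetical reconstruction from the logged result:
#   ok: [managed-node1] => ansible_facts: { "network_provider": "nm" }
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```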
15896 1727203858.82903: running TaskExecutor() for managed-node1/TASK: Gathering Facts 15896 1727203858.83059: in run() - task 028d2410-947f-fb83-b6ad-000000000218 15896 1727203858.83063: variable 'ansible_search_path' from source: unknown 15896 1727203858.83133: calling self._execute() 15896 1727203858.83288: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.83292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.83296: variable 'omit' from source: magic vars 15896 1727203858.83865: variable 'ansible_distribution_major_version' from source: facts 15896 1727203858.83884: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203858.83924: variable 'omit' from source: magic vars 15896 1727203858.83954: variable 'omit' from source: magic vars 15896 1727203858.84028: variable 'omit' from source: magic vars 15896 1727203858.84039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203858.84096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203858.84132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203858.84253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.84256: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203858.84320: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203858.84330: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.84338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.84528: Set connection var ansible_shell_type to sh 15896 1727203858.84580: Set connection 
var ansible_connection to ssh 15896 1727203858.84584: Set connection var ansible_shell_executable to /bin/sh 15896 1727203858.84586: Set connection var ansible_pipelining to False 15896 1727203858.84588: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203858.84591: Set connection var ansible_timeout to 10 15896 1727203858.84610: variable 'ansible_shell_executable' from source: unknown 15896 1727203858.84617: variable 'ansible_connection' from source: unknown 15896 1727203858.84624: variable 'ansible_module_compression' from source: unknown 15896 1727203858.84631: variable 'ansible_shell_type' from source: unknown 15896 1727203858.84692: variable 'ansible_shell_executable' from source: unknown 15896 1727203858.84695: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203858.84697: variable 'ansible_pipelining' from source: unknown 15896 1727203858.84700: variable 'ansible_timeout' from source: unknown 15896 1727203858.84702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203858.84910: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203858.84970: variable 'omit' from source: magic vars 15896 1727203858.84982: starting attempt loop 15896 1727203858.84986: running the handler 15896 1727203858.85019: variable 'ansible_facts' from source: unknown 15896 1727203858.85030: _low_level_execute_command(): starting 15896 1727203858.85128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203858.86142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203858.86148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203858.86290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203858.86378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203858.88372: stdout chunk (state=3): >>>/root <<< 15896 1727203858.88493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203858.88501: stdout chunk (state=3): >>><<< 15896 1727203858.88517: stderr chunk (state=3): >>><<< 15896 1727203858.88644: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203858.88648: _low_level_execute_command(): starting 15896 1727203858.88650: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228 `" && echo ansible-tmp-1727203858.885492-16283-34153333878228="` echo /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228 `" ) && sleep 0' 15896 1727203858.89880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203858.90106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203858.90139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203858.90213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203858.90394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203858.92553: stdout chunk (state=3): >>>ansible-tmp-1727203858.885492-16283-34153333878228=/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228 <<< 15896 1727203858.92843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203858.92867: stderr chunk (state=3): >>><<< 15896 1727203858.92871: stdout chunk (state=3): >>><<< 15896 1727203858.93283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203858.885492-16283-34153333878228=/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203858.93287: variable 'ansible_module_compression' from source: unknown 15896 1727203858.93290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15896 1727203858.93292: variable 'ansible_facts' from source: unknown 15896 1727203858.93550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py 15896 1727203858.94357: Sending initial data 15896 1727203858.94360: Sent initial data (152 bytes) 15896 1727203858.95891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203858.95963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203858.95994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 
1727203858.96044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203858.96219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203858.98712: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203858.98755: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203858.98829: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203858.98952: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpcc95vn_a /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py <<< 15896 1727203858.98972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py" <<< 15896 1727203858.99018: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpcc95vn_a" to remote "/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py" <<< 15896 1727203859.01220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203859.01278: stderr chunk (state=3): >>><<< 15896 1727203859.01290: stdout chunk (state=3): >>><<< 15896 1727203859.01335: done transferring module to remote 15896 1727203859.01424: _low_level_execute_command(): starting 15896 1727203859.01427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/ /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py && sleep 0' 15896 1727203859.02465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203859.02484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203859.02615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203859.05192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203859.05229: stderr chunk (state=3): >>><<< 15896 1727203859.05237: stdout chunk (state=3): >>><<< 15896 1727203859.05253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203859.05266: _low_level_execute_command(): starting 15896 1727203859.05278: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/AnsiballZ_setup.py && sleep 0' 15896 1727203859.06012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203859.06311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203859.06519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203859.87766: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, 
"ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec277914f6c5b9c03bd977e30033112b", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "59", "epoch": "1727203859", "epoch_int": "1727203859", "date": "2024-09-24", "time": "14:50:59", "iso8601_micro": "2024-09-24T18:50:59.455297Z", "iso8601": "2024-09-24T18:50:59Z", "iso8601_basic": "20240924T145059455297", "iso8601_basic_short": "20240924T145059", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.66064453125, "5m": 0.34814453125, "15m": 0.162109375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2922, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 609, "free": 2922}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": 
[], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 450, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788094464, "block_size": 4096, "block_total": 65519099, "block_available": 63913109, "block_used": 1605990, "inode_total": 131070960, "inode_available": 131027262, "inode_used": 43698, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", 
"broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15896 1727203859.90539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. <<< 15896 1727203859.90543: stdout chunk (state=3): >>><<< 15896 1727203859.90545: stderr chunk (state=3): >>><<< 15896 1727203859.90549: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "59", "epoch": "1727203859", "epoch_int": "1727203859", "date": "2024-09-24", "time": "14:50:59", "iso8601_micro": "2024-09-24T18:50:59.455297Z", "iso8601": "2024-09-24T18:50:59Z", "iso8601_basic": "20240924T145059455297", "iso8601_basic_short": "20240924T145059", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.66064453125, "5m": 0.34814453125, "15m": 0.162109375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2922, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 609, "free": 2922}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, 
"model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 450, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261788094464, "block_size": 4096, "block_total": 65519099, "block_available": 63913109, "block_used": 1605990, "inode_total": 131070960, "inode_available": 131027262, "inode_used": 43698, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203859.91629: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203859.91633: _low_level_execute_command(): starting 15896 1727203859.91638: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203858.885492-16283-34153333878228/ > /dev/null 2>&1 && sleep 0' 15896 1727203859.93121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203859.93196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203859.93430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203859.93449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203859.93467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203859.93496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203859.93792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203859.95698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203859.95822: stderr chunk (state=3): >>><<< 15896 1727203859.95832: stdout chunk (state=3): >>><<< 15896 1727203859.95852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203859.96036: 
handler run complete 15896 1727203859.96189: variable 'ansible_facts' from source: unknown 15896 1727203859.96584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203859.97296: variable 'ansible_facts' from source: unknown 15896 1727203859.97387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203859.97733: attempt loop complete, returning result 15896 1727203859.97811: _execute() done 15896 1727203859.97819: dumping result to json 15896 1727203859.97893: done dumping result, returning 15896 1727203859.97955: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-fb83-b6ad-000000000218] 15896 1727203859.97967: sending task result for task 028d2410-947f-fb83-b6ad-000000000218 15896 1727203859.99504: done sending task result for task 028d2410-947f-fb83-b6ad-000000000218 15896 1727203859.99507: WORKER PROCESS EXITING ok: [managed-node1] 15896 1727203860.00084: no more pending results, returning what we have 15896 1727203860.00087: results queue empty 15896 1727203860.00088: checking for any_errors_fatal 15896 1727203860.00090: done checking for any_errors_fatal 15896 1727203860.00090: checking for max_fail_percentage 15896 1727203860.00092: done checking for max_fail_percentage 15896 1727203860.00093: checking to see if all hosts have failed and the running result is not ok 15896 1727203860.00094: done checking to see if all hosts have failed 15896 1727203860.00094: getting the remaining hosts for this loop 15896 1727203860.00096: done getting the remaining hosts for this loop 15896 1727203860.00099: getting the next task for host managed-node1 15896 1727203860.00104: done getting next task for host managed-node1 15896 1727203860.00106: ^ task is: TASK: meta (flush_handlers) 15896 1727203860.00108: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203860.00111: getting variables 15896 1727203860.00180: in VariableManager get_vars() 15896 1727203860.00336: Calling all_inventory to load vars for managed-node1 15896 1727203860.00340: Calling groups_inventory to load vars for managed-node1 15896 1727203860.00343: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203860.00354: Calling all_plugins_play to load vars for managed-node1 15896 1727203860.00356: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203860.00361: Calling groups_plugins_play to load vars for managed-node1 15896 1727203860.00906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203860.01588: done with get_vars() 15896 1727203860.01715: done getting variables 15896 1727203860.01799: in VariableManager get_vars() 15896 1727203860.02083: Calling all_inventory to load vars for managed-node1 15896 1727203860.02085: Calling groups_inventory to load vars for managed-node1 15896 1727203860.02088: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203860.02093: Calling all_plugins_play to load vars for managed-node1 15896 1727203860.02095: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203860.02098: Calling groups_plugins_play to load vars for managed-node1 15896 1727203860.02591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203860.03516: done with get_vars() 15896 1727203860.03533: done queuing things up, now waiting for results queue to drain 15896 1727203860.03535: results queue empty 15896 1727203860.03536: checking for any_errors_fatal 15896 1727203860.03540: done checking for any_errors_fatal 15896 
1727203860.03541: checking for max_fail_percentage 15896 1727203860.03564: done checking for max_fail_percentage 15896 1727203860.03569: checking to see if all hosts have failed and the running result is not ok 15896 1727203860.03570: done checking to see if all hosts have failed 15896 1727203860.03571: getting the remaining hosts for this loop 15896 1727203860.03572: done getting the remaining hosts for this loop 15896 1727203860.03585: getting the next task for host managed-node1 15896 1727203860.03589: done getting next task for host managed-node1 15896 1727203860.03591: ^ task is: TASK: INIT Prepare setup 15896 1727203860.03593: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203860.03595: getting variables 15896 1727203860.03596: in VariableManager get_vars() 15896 1727203860.03674: Calling all_inventory to load vars for managed-node1 15896 1727203860.03765: Calling groups_inventory to load vars for managed-node1 15896 1727203860.03768: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203860.03773: Calling all_plugins_play to load vars for managed-node1 15896 1727203860.03777: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203860.03780: Calling groups_plugins_play to load vars for managed-node1 15896 1727203860.04091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203860.04749: done with get_vars() 15896 1727203860.04757: done getting variables 15896 1727203860.05381: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15 Tuesday 24 September 2024 14:51:00 -0400 (0:00:01.234) 0:00:05.643 ***** 15896 1727203860.05409: entering _queue_task() for managed-node1/debug 15896 1727203860.05411: Creating lock for debug 15896 1727203860.06980: worker is 1 (out of 1 available) 15896 1727203860.06995: exiting _queue_task() for managed-node1/debug 15896 1727203860.07006: done queuing things up, now waiting for results queue to drain 15896 1727203860.07008: waiting for pending results... 15896 1727203860.07600: running TaskExecutor() for managed-node1/TASK: INIT Prepare setup 15896 1727203860.07606: in run() - task 028d2410-947f-fb83-b6ad-00000000000b 15896 1727203860.08036: variable 'ansible_search_path' from source: unknown 15896 1727203860.08040: calling self._execute() 15896 1727203860.08223: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.08263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.08583: variable 'omit' from source: magic vars 15896 1727203860.09685: variable 'ansible_distribution_major_version' from source: facts 15896 1727203860.09702: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203860.09760: variable 'omit' from source: magic vars 15896 1727203860.10084: variable 'omit' from source: magic vars 15896 1727203860.10088: variable 'omit' from source: magic vars 15896 1727203860.10090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203860.10409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 15896 1727203860.10412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203860.10415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203860.10422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203860.10735: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203860.10739: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.10741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.10923: Set connection var ansible_shell_type to sh 15896 1727203860.10937: Set connection var ansible_connection to ssh 15896 1727203860.10966: Set connection var ansible_shell_executable to /bin/sh 15896 1727203860.11285: Set connection var ansible_pipelining to False 15896 1727203860.11288: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203860.11292: Set connection var ansible_timeout to 10 15896 1727203860.11295: variable 'ansible_shell_executable' from source: unknown 15896 1727203860.11297: variable 'ansible_connection' from source: unknown 15896 1727203860.11299: variable 'ansible_module_compression' from source: unknown 15896 1727203860.11301: variable 'ansible_shell_type' from source: unknown 15896 1727203860.11303: variable 'ansible_shell_executable' from source: unknown 15896 1727203860.11305: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.11307: variable 'ansible_pipelining' from source: unknown 15896 1727203860.11309: variable 'ansible_timeout' from source: unknown 15896 1727203860.11311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.11545: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203860.11563: variable 'omit' from source: magic vars 15896 1727203860.11722: starting attempt loop 15896 1727203860.11726: running the handler 15896 1727203860.11810: handler run complete 15896 1727203860.12083: attempt loop complete, returning result 15896 1727203860.12086: _execute() done 15896 1727203860.12089: dumping result to json 15896 1727203860.12091: done dumping result, returning 15896 1727203860.12093: done running TaskExecutor() for managed-node1/TASK: INIT Prepare setup [028d2410-947f-fb83-b6ad-00000000000b] 15896 1727203860.12096: sending task result for task 028d2410-947f-fb83-b6ad-00000000000b 15896 1727203860.12373: done sending task result for task 028d2410-947f-fb83-b6ad-00000000000b 15896 1727203860.12378: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: ################################################## 15896 1727203860.12437: no more pending results, returning what we have 15896 1727203860.12440: results queue empty 15896 1727203860.12441: checking for any_errors_fatal 15896 1727203860.12444: done checking for any_errors_fatal 15896 1727203860.12445: checking for max_fail_percentage 15896 1727203860.12446: done checking for max_fail_percentage 15896 1727203860.12447: checking to see if all hosts have failed and the running result is not ok 15896 1727203860.12448: done checking to see if all hosts have failed 15896 1727203860.12448: getting the remaining hosts for this loop 15896 1727203860.12451: done getting the remaining hosts for this loop 15896 1727203860.12455: getting the next task for host managed-node1 15896 1727203860.12461: done getting next task for host managed-node1 15896 1727203860.12465: ^ task is: TASK: Install dnsmasq 15896 1727203860.12469: ^ 
state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203860.12474: getting variables 15896 1727203860.12478: in VariableManager get_vars() 15896 1727203860.12534: Calling all_inventory to load vars for managed-node1 15896 1727203860.12536: Calling groups_inventory to load vars for managed-node1 15896 1727203860.12538: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203860.12549: Calling all_plugins_play to load vars for managed-node1 15896 1727203860.12551: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203860.12553: Calling groups_plugins_play to load vars for managed-node1 15896 1727203860.13369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203860.14214: done with get_vars() 15896 1727203860.14226: done getting variables 15896 1727203860.14455: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:51:00 -0400 (0:00:00.092) 
0:00:05.735 ***** 15896 1727203860.14634: entering _queue_task() for managed-node1/package 15896 1727203860.15440: worker is 1 (out of 1 available) 15896 1727203860.15454: exiting _queue_task() for managed-node1/package 15896 1727203860.15470: done queuing things up, now waiting for results queue to drain 15896 1727203860.15472: waiting for pending results... 15896 1727203860.16079: running TaskExecutor() for managed-node1/TASK: Install dnsmasq 15896 1727203860.16131: in run() - task 028d2410-947f-fb83-b6ad-00000000000f 15896 1727203860.16584: variable 'ansible_search_path' from source: unknown 15896 1727203860.16587: variable 'ansible_search_path' from source: unknown 15896 1727203860.16590: calling self._execute() 15896 1727203860.16783: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.16786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.16788: variable 'omit' from source: magic vars 15896 1727203860.17664: variable 'ansible_distribution_major_version' from source: facts 15896 1727203860.17684: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203860.17697: variable 'omit' from source: magic vars 15896 1727203860.17821: variable 'omit' from source: magic vars 15896 1727203860.18425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203860.20714: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203860.20960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203860.20964: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203860.21059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203860.21254: Loading 
FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203860.21333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203860.21423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203860.21506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203860.21553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203860.21627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203860.21884: variable '__network_is_ostree' from source: set_fact 15896 1727203860.21895: variable 'omit' from source: magic vars 15896 1727203860.21935: variable 'omit' from source: magic vars 15896 1727203860.22157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203860.22160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203860.22163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203860.22166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203860.22267: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203860.22308: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203860.22317: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.22328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.22591: Set connection var ansible_shell_type to sh 15896 1727203860.22596: Set connection var ansible_connection to ssh 15896 1727203860.22598: Set connection var ansible_shell_executable to /bin/sh 15896 1727203860.22600: Set connection var ansible_pipelining to False 15896 1727203860.22602: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203860.22604: Set connection var ansible_timeout to 10 15896 1727203860.22707: variable 'ansible_shell_executable' from source: unknown 15896 1727203860.22719: variable 'ansible_connection' from source: unknown 15896 1727203860.22728: variable 'ansible_module_compression' from source: unknown 15896 1727203860.22734: variable 'ansible_shell_type' from source: unknown 15896 1727203860.22740: variable 'ansible_shell_executable' from source: unknown 15896 1727203860.22746: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203860.22918: variable 'ansible_pipelining' from source: unknown 15896 1727203860.22922: variable 'ansible_timeout' from source: unknown 15896 1727203860.22924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203860.23006: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203860.23027: variable 'omit' from source: magic vars 15896 1727203860.23039: 
starting attempt loop 15896 1727203860.23050: running the handler 15896 1727203860.23063: variable 'ansible_facts' from source: unknown 15896 1727203860.23069: variable 'ansible_facts' from source: unknown 15896 1727203860.23138: _low_level_execute_command(): starting 15896 1727203860.23142: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203860.23918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203860.23936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203860.24030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203860.24068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203860.24096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203860.24123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203860.24355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203860.26042: stdout chunk (state=3): >>>/root <<< 15896 1727203860.26205: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 15896 1727203860.26247: stderr chunk (state=3): >>><<< 15896 1727203860.26258: stdout chunk (state=3): >>><<< 15896 1727203860.26382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203860.26462: _low_level_execute_command(): starting 15896 1727203860.26466: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648 `" && echo ansible-tmp-1727203860.263693-16414-107803580771648="` echo /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648 `" ) && sleep 0' 15896 1727203860.27725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203860.27746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203860.27759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203860.27769: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203860.27941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203860.28195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203860.28398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203860.28497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203860.30614: stdout chunk (state=3): >>>ansible-tmp-1727203860.263693-16414-107803580771648=/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648 <<< 15896 1727203860.30762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203860.30766: stdout chunk (state=3): >>><<< 15896 1727203860.30768: stderr chunk (state=3): >>><<< 15896 1727203860.30984: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203860.263693-16414-107803580771648=/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203860.30987: variable 'ansible_module_compression' from source: unknown 15896 1727203860.30991: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 15896 1727203860.30994: ANSIBALLZ: Acquiring lock 15896 1727203860.30996: ANSIBALLZ: Lock acquired: 140082272719056 15896 1727203860.30997: ANSIBALLZ: Creating module 15896 1727203860.55299: ANSIBALLZ: Writing module into payload 15896 1727203860.55509: ANSIBALLZ: Writing module 15896 1727203860.55533: ANSIBALLZ: Renaming module 15896 1727203860.55542: ANSIBALLZ: Done creating module 15896 1727203860.55561: variable 'ansible_facts' from source: unknown 15896 1727203860.55682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py 15896 1727203860.55824: Sending initial data 15896 
1727203860.55887: Sent initial data (151 bytes) 15896 1727203860.56505: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203860.56519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203860.56533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203860.56549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203860.56572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203860.56587: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203860.56691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203860.56703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203860.56820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203860.58577: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203860.58603: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203860.58701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203860.58791: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp3pn277mu /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py <<< 15896 1727203860.58801: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py" <<< 15896 1727203860.58863: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp3pn277mu" to remote "/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py" <<< 15896 1727203860.60072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203860.60127: stderr chunk (state=3): >>><<< 15896 1727203860.60181: stdout chunk (state=3): >>><<< 15896 1727203860.60185: done transferring module to remote 15896 1727203860.60187: _low_level_execute_command(): starting 15896 1727203860.60194: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/ 
/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py && sleep 0' 15896 1727203860.60878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203860.60894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203860.60910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203860.60989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203860.61036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203860.61053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203860.61082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203860.61213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203860.63239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203860.63242: stdout chunk (state=3): >>><<< 15896 1727203860.63245: stderr chunk (state=3): >>><<< 15896 1727203860.63258: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203860.63339: _low_level_execute_command(): starting 15896 1727203860.63342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/AnsiballZ_dnf.py && sleep 0' 15896 1727203860.63921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203860.63944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203860.64066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.10402: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 15896 1727203861.15637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203861.15642: stderr chunk (state=3): >>><<< 15896 1727203861.15644: stdout chunk (state=3): >>><<< 15896 1727203861.15951: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203861.15964: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203861.15968: _low_level_execute_command(): starting 15896 1727203861.15970: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203860.263693-16414-107803580771648/ > /dev/null 2>&1 && sleep 0' 15896 1727203861.17304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203861.17307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.17310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203861.17312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.17389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203861.17929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203861.18133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.20126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203861.20156: stderr chunk (state=3): >>><<< 15896 1727203861.20296: stdout chunk (state=3): >>><<< 15896 1727203861.20299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203861.20517: handler run complete 15896 1727203861.20795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203861.21202: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203861.21384: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203861.21389: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203861.21510: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203861.21827: variable '__install_status' from source: unknown 15896 1727203861.21830: Evaluated conditional (__install_status is success): True 15896 1727203861.21832: attempt loop complete, returning result 15896 1727203861.21834: _execute() done 15896 1727203861.21836: dumping result to json 15896 1727203861.21838: done dumping result, returning 15896 1727203861.21840: done running TaskExecutor() for managed-node1/TASK: Install dnsmasq [028d2410-947f-fb83-b6ad-00000000000f] 15896 1727203861.21841: sending task result for task 028d2410-947f-fb83-b6ad-00000000000f 15896 1727203861.22340: done sending task result for task 028d2410-947f-fb83-b6ad-00000000000f 15896 1727203861.22343: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 15896 1727203861.22506: no more pending results, returning what we have 15896 1727203861.22509: results queue empty 15896 1727203861.22510: checking for any_errors_fatal 15896 1727203861.22590: done checking for any_errors_fatal 15896 1727203861.22592: checking for 
max_fail_percentage 15896 1727203861.22593: done checking for max_fail_percentage 15896 1727203861.22594: checking to see if all hosts have failed and the running result is not ok 15896 1727203861.22595: done checking to see if all hosts have failed 15896 1727203861.22596: getting the remaining hosts for this loop 15896 1727203861.22597: done getting the remaining hosts for this loop 15896 1727203861.22601: getting the next task for host managed-node1 15896 1727203861.22606: done getting next task for host managed-node1 15896 1727203861.22609: ^ task is: TASK: Install pgrep, sysctl 15896 1727203861.22612: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203861.22616: getting variables 15896 1727203861.22617: in VariableManager get_vars() 15896 1727203861.22806: Calling all_inventory to load vars for managed-node1 15896 1727203861.22809: Calling groups_inventory to load vars for managed-node1 15896 1727203861.22811: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203861.22820: Calling all_plugins_play to load vars for managed-node1 15896 1727203861.22823: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203861.22826: Calling groups_plugins_play to load vars for managed-node1 15896 1727203861.23271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203861.23471: done with get_vars() 15896 1727203861.23484: done getting variables 15896 1727203861.23541: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:51:01 -0400 (0:00:01.089) 0:00:06.825 ***** 15896 1727203861.23577: entering _queue_task() for managed-node1/package 15896 1727203861.23886: worker is 1 (out of 1 available) 15896 1727203861.23898: exiting _queue_task() for managed-node1/package 15896 1727203861.23911: done queuing things up, now waiting for results queue to drain 15896 1727203861.23913: waiting for pending results... 
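[Editor's note] The entries above and below show the executor evaluating task conditionals such as `ansible_distribution_major_version != '6'` (a plain string comparison) and `ansible_distribution_major_version is version('6', '<=')` (the Jinja2 `version` test, which compares dotted version strings numerically). The following is a minimal Python sketch of the assumed semantics of those two checks; it is an illustration, not Ansible's actual implementation, and the fact value `"9"` is an example stand-in for whatever `ansible_distribution_major_version` was gathered as.

```python
import operator

# Hypothetical helper illustrating the Jinja2 "version" test's comparison:
# split dotted version strings into integer tuples, then apply the operator.
# This is an assumed simplification, not Ansible's real code path.
OPS = {"<=": operator.le, ">=": operator.ge, "<": operator.lt,
       ">": operator.gt, "==": operator.eq, "!=": operator.ne}

def version_test(value, other, op="=="):
    """Compare dotted version strings component-by-component."""
    split = lambda v: tuple(int(p) for p in str(v).split("."))
    return OPS[op](split(value), split(other))

major = "9"  # example fact value; the real value comes from gathered facts

# Mirrors the three evaluations visible in the log:
print(major != "6")                       # -> True  (task not skipped)
print(version_test(major, "6", "<="))     # -> False (task skipped)
print(version_test(major, "7", ">="))     # -> True  (task runs)
```

This is why the first "Install pgrep, sysctl" task (gated on `version('6', '<=')`) is skipped while the second (gated on `version('7', '>=')`) proceeds to execute.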
15896 1727203861.24457: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 15896 1727203861.24606: in run() - task 028d2410-947f-fb83-b6ad-000000000010 15896 1727203861.24625: variable 'ansible_search_path' from source: unknown 15896 1727203861.24685: variable 'ansible_search_path' from source: unknown 15896 1727203861.24690: calling self._execute() 15896 1727203861.24787: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203861.24981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203861.24984: variable 'omit' from source: magic vars 15896 1727203861.25223: variable 'ansible_distribution_major_version' from source: facts 15896 1727203861.25241: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203861.25490: variable 'ansible_os_family' from source: facts 15896 1727203861.25545: Evaluated conditional (ansible_os_family == 'RedHat'): True 15896 1727203861.25884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203861.26157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203861.26214: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203861.26255: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203861.26339: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203861.26432: variable 'ansible_distribution_major_version' from source: facts 15896 1727203861.26449: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 15896 1727203861.26456: when evaluation is False, skipping this task 15896 1727203861.26464: _execute() done 15896 1727203861.26471: dumping result to json 15896 1727203861.26481: done dumping result, 
returning 15896 1727203861.26491: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [028d2410-947f-fb83-b6ad-000000000010] 15896 1727203861.26500: sending task result for task 028d2410-947f-fb83-b6ad-000000000010 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 15896 1727203861.26671: no more pending results, returning what we have 15896 1727203861.26677: results queue empty 15896 1727203861.26678: checking for any_errors_fatal 15896 1727203861.26687: done checking for any_errors_fatal 15896 1727203861.26688: checking for max_fail_percentage 15896 1727203861.26690: done checking for max_fail_percentage 15896 1727203861.26691: checking to see if all hosts have failed and the running result is not ok 15896 1727203861.26692: done checking to see if all hosts have failed 15896 1727203861.26692: getting the remaining hosts for this loop 15896 1727203861.26694: done getting the remaining hosts for this loop 15896 1727203861.26698: getting the next task for host managed-node1 15896 1727203861.26705: done getting next task for host managed-node1 15896 1727203861.26707: ^ task is: TASK: Install pgrep, sysctl 15896 1727203861.26711: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203861.26714: getting variables 15896 1727203861.26716: in VariableManager get_vars() 15896 1727203861.26774: Calling all_inventory to load vars for managed-node1 15896 1727203861.26984: Calling groups_inventory to load vars for managed-node1 15896 1727203861.26987: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203861.26997: Calling all_plugins_play to load vars for managed-node1 15896 1727203861.27000: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203861.27003: Calling groups_plugins_play to load vars for managed-node1 15896 1727203861.27172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203861.27463: done with get_vars() 15896 1727203861.27485: done getting variables 15896 1727203861.27555: done sending task result for task 028d2410-947f-fb83-b6ad-000000000010 15896 1727203861.27559: WORKER PROCESS EXITING 15896 1727203861.27597: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:51:01 -0400 (0:00:00.040) 0:00:06.865 ***** 15896 1727203861.27627: entering _queue_task() for managed-node1/package 15896 1727203861.27924: worker is 1 (out of 1 available) 15896 1727203861.27937: exiting _queue_task() for managed-node1/package 15896 1727203861.27948: done queuing things up, now waiting for results queue to drain 15896 1727203861.27950: waiting for pending results... 
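[Editor's note] Once a task passes its conditionals, the log shows `_low_level_execute_command()` staging a per-task remote working directory before the module payload is transferred, using a `( umask 77 && mkdir ... )` subshell and a path of the form `ansible-tmp-<epoch>-<pid>-<random>`. The following shell sketch reproduces that dance locally under a scratch directory; the `12345` suffix is a stand-in for Ansible's random component, not a real value from this run.

```shell
#!/bin/sh
# Sketch of the remote tmp-dir creation Ansible performs before module
# transfer (pattern taken from the mkdir command visible later in this log).
scratch="$(mktemp -d)"
tmpdir="$scratch/ansible-tmp-$(date +%s)-$$-12345"  # 12345: stand-in random suffix

# umask 77 ensures the staged module directory is owner-only (mode 700),
# so other local users on the managed host cannot read the module payload.
( umask 77 && mkdir -p "$scratch" && mkdir "$tmpdir" && echo "$tmpdir" )

stat -c '%a' "$tmpdir"   # prints 700 on GNU stat: owner-only permissions
```

The directory is removed again with `rm -f -r ... && sleep 0` after the module result has been collected, which is the cleanup command visible near the end of this task's trace.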
15896 1727203861.28203: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 15896 1727203861.28322: in run() - task 028d2410-947f-fb83-b6ad-000000000011 15896 1727203861.28341: variable 'ansible_search_path' from source: unknown 15896 1727203861.28347: variable 'ansible_search_path' from source: unknown 15896 1727203861.28384: calling self._execute() 15896 1727203861.28473: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203861.28487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203861.28501: variable 'omit' from source: magic vars 15896 1727203861.28931: variable 'ansible_distribution_major_version' from source: facts 15896 1727203861.28954: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203861.29073: variable 'ansible_os_family' from source: facts 15896 1727203861.29088: Evaluated conditional (ansible_os_family == 'RedHat'): True 15896 1727203861.29310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203861.29574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203861.29628: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203861.29666: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203861.29733: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203861.29928: variable 'ansible_distribution_major_version' from source: facts 15896 1727203861.29994: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 15896 1727203861.30006: variable 'omit' from source: magic vars 15896 1727203861.30060: variable 'omit' from source: magic vars 15896 1727203861.30436: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203861.32753: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203861.32828: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203861.32879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203861.32917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203861.32954: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203861.33059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203861.33100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203861.33131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203861.33182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203861.33201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203861.33314: variable '__network_is_ostree' from source: set_fact 15896 1727203861.33326: 
variable 'omit' from source: magic vars 15896 1727203861.33362: variable 'omit' from source: magic vars 15896 1727203861.33400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203861.33431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203861.33456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203861.33480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203861.33499: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203861.33607: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203861.33610: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203861.33613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203861.33654: Set connection var ansible_shell_type to sh 15896 1727203861.33669: Set connection var ansible_connection to ssh 15896 1727203861.33682: Set connection var ansible_shell_executable to /bin/sh 15896 1727203861.33693: Set connection var ansible_pipelining to False 15896 1727203861.33703: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203861.33718: Set connection var ansible_timeout to 10 15896 1727203861.33747: variable 'ansible_shell_executable' from source: unknown 15896 1727203861.33756: variable 'ansible_connection' from source: unknown 15896 1727203861.33763: variable 'ansible_module_compression' from source: unknown 15896 1727203861.33770: variable 'ansible_shell_type' from source: unknown 15896 1727203861.33779: variable 'ansible_shell_executable' from source: unknown 15896 1727203861.33786: variable 'ansible_host' from source: host vars for 'managed-node1' 
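[Editor's note] The connection-variable entries above include `Set connection var ansible_module_compression to ZIP_DEFLATED`, and the module transfer below reuses a cached `ansible.modules.dnf-ZIP_DEFLATED` artifact. The sketch below shows what that setting means in practice: the module source is packed into a deflate-compressed zip that gets embedded in the `AnsiballZ_dnf.py` wrapper copied to the remote host. This is a minimal illustration of the compression step only, not Ansible's actual AnsiballZ packaging code, and the archive member name is a hypothetical example.

```python
import io
import zipfile

# Repetitive stand-in for module source; real module code compresses similarly well.
module_source = b"def main():\n    pass\n" * 100

# Pack the source into an in-memory zip using deflate compression,
# mirroring the ZIP_DEFLATED connection variable seen in the log.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("ansible/modules/dnf.py", module_source)  # hypothetical member name
payload = buf.getvalue()

# The compressed payload is what the wrapper script carries to the remote side.
print(len(module_source), len(payload))
```

Shipping the compressed payload inside a single wrapper script is why the log shows only one `sftp put` of `AnsiballZ_dnf.py` per task rather than a transfer of the module and its library dependencies individually.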
15896 1727203861.33794: variable 'ansible_pipelining' from source: unknown 15896 1727203861.33801: variable 'ansible_timeout' from source: unknown 15896 1727203861.33809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203861.33934: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203861.33937: variable 'omit' from source: magic vars 15896 1727203861.33941: starting attempt loop 15896 1727203861.33950: running the handler 15896 1727203861.34042: variable 'ansible_facts' from source: unknown 15896 1727203861.34045: variable 'ansible_facts' from source: unknown 15896 1727203861.34047: _low_level_execute_command(): starting 15896 1727203861.34049: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203861.34719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203861.34731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203861.34743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203861.34761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203861.34797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.34809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203861.34891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203861.34925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203861.34938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203861.35050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.36844: stdout chunk (state=3): >>>/root <<< 15896 1727203861.36988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203861.37000: stdout chunk (state=3): >>><<< 15896 1727203861.37013: stderr chunk (state=3): >>><<< 15896 1727203861.37144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203861.37147: _low_level_execute_command(): starting 15896 1727203861.37151: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029 `" && echo ansible-tmp-1727203861.370479-16577-139974063846029="` echo /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029 `" ) && sleep 0' 15896 1727203861.37798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203861.37802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203861.37821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.37903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203861.37931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15896 1727203861.38047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.40155: stdout chunk (state=3): >>>ansible-tmp-1727203861.370479-16577-139974063846029=/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029 <<< 15896 1727203861.40345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203861.40349: stdout chunk (state=3): >>><<< 15896 1727203861.40351: stderr chunk (state=3): >>><<< 15896 1727203861.40582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203861.370479-16577-139974063846029=/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203861.40586: variable 'ansible_module_compression' from source: unknown 15896 1727203861.40589: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 15896 1727203861.40591: variable 'ansible_facts' from source: unknown 15896 1727203861.40646: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py 15896 1727203861.40801: Sending initial data 15896 1727203861.40891: Sent initial data (151 bytes) 15896 1727203861.41568: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203861.41700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203861.41726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203861.41842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.43569: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203861.43615: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203861.43746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203861.43843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmprnbeoqc_ /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py <<< 15896 1727203861.43864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py" <<< 15896 1727203861.43980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmprnbeoqc_" to remote "/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py" <<< 15896 1727203861.45139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203861.45225: stderr chunk (state=3): >>><<< 15896 1727203861.45351: stdout chunk (state=3): >>><<< 15896 1727203861.45355: done transferring module to remote 15896 1727203861.45358: _low_level_execute_command(): starting 15896 1727203861.45361: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/ /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py && sleep 0' 15896 1727203861.46039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.46116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203861.46166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203861.46169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203861.46284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.48290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203861.48357: stderr chunk (state=3): >>><<< 15896 1727203861.48363: stdout chunk (state=3): >>><<< 15896 1727203861.48465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203861.48471: _low_level_execute_command(): starting 15896 1727203861.48677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/AnsiballZ_dnf.py && sleep 0' 15896 1727203861.49347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203861.49362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203861.49377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203861.49395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203861.49409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203861.49419: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203861.49435: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203861.49453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203861.49465: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203861.49489: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203861.49613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203861.49640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203861.49780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203861.96246: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 15896 1727203862.02231: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203862.02235: stdout chunk (state=3): >>><<< 15896 1727203862.02238: stderr chunk (state=3): >>><<< 15896 1727203862.02302: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203862.02372: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203862.02380: _low_level_execute_command(): starting 15896 1727203862.02385: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203861.370479-16577-139974063846029/ > /dev/null 2>&1 && sleep 0' 15896 1727203862.03860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203862.03923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203862.04089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203862.04112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.04137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.04259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203862.07090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203862.07094: stdout chunk (state=3): >>><<< 15896 1727203862.07097: stderr chunk (state=3): >>><<< 15896 1727203862.07099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203862.07102: handler run complete 15896 1727203862.07104: attempt loop complete, returning result 15896 1727203862.07106: _execute() done 15896 1727203862.07108: dumping result to json 15896 1727203862.07110: done dumping result, returning 15896 1727203862.07112: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [028d2410-947f-fb83-b6ad-000000000011] 15896 1727203862.07114: sending task result for task 028d2410-947f-fb83-b6ad-000000000011 15896 1727203862.07199: done sending task result for task 028d2410-947f-fb83-b6ad-000000000011 15896 1727203862.07204: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 15896 1727203862.07301: no more pending results, returning what we have 15896 1727203862.07305: results queue empty 15896 1727203862.07306: checking for any_errors_fatal 15896 1727203862.07314: done checking for any_errors_fatal 15896 1727203862.07314: checking for max_fail_percentage 15896 1727203862.07316: done checking for max_fail_percentage 15896 1727203862.07317: checking to see if all hosts have failed and the running result is not ok 15896 1727203862.07317: done checking to see if all hosts have failed 15896 1727203862.07318: getting the remaining hosts for this loop 15896 1727203862.07319: done getting the remaining hosts for this loop 15896 1727203862.07323: getting the next task for host managed-node1 15896 1727203862.07334: done getting next task for host managed-node1 15896 1727203862.07337: ^ task is: TASK: Create test interfaces 15896 1727203862.07340: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203862.07344: getting variables 15896 1727203862.07345: in VariableManager get_vars() 15896 1727203862.07906: Calling all_inventory to load vars for managed-node1 15896 1727203862.07910: Calling groups_inventory to load vars for managed-node1 15896 1727203862.07914: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203862.08132: Calling all_plugins_play to load vars for managed-node1 15896 1727203862.08137: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203862.08140: Calling groups_plugins_play to load vars for managed-node1 15896 1727203862.08585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203862.08889: done with get_vars() 15896 1727203862.08905: done getting variables 15896 1727203862.09071: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:51:02 -0400 (0:00:00.814) 0:00:07.680 ***** 15896 1727203862.09107: entering _queue_task() for managed-node1/shell 15896 1727203862.09109: Creating lock for shell 15896 1727203862.09614: worker 
is 1 (out of 1 available) 15896 1727203862.09624: exiting _queue_task() for managed-node1/shell 15896 1727203862.09635: done queuing things up, now waiting for results queue to drain 15896 1727203862.09636: waiting for pending results... 15896 1727203862.09875: running TaskExecutor() for managed-node1/TASK: Create test interfaces 15896 1727203862.10216: in run() - task 028d2410-947f-fb83-b6ad-000000000012 15896 1727203862.10403: variable 'ansible_search_path' from source: unknown 15896 1727203862.10407: variable 'ansible_search_path' from source: unknown 15896 1727203862.10410: calling self._execute() 15896 1727203862.10548: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203862.10591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203862.10633: variable 'omit' from source: magic vars 15896 1727203862.11469: variable 'ansible_distribution_major_version' from source: facts 15896 1727203862.11489: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203862.11564: variable 'omit' from source: magic vars 15896 1727203862.11822: variable 'omit' from source: magic vars 15896 1727203862.13021: variable 'dhcp_interface1' from source: play vars 15896 1727203862.13156: variable 'dhcp_interface2' from source: play vars 15896 1727203862.13253: variable 'omit' from source: magic vars 15896 1727203862.13471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203862.13889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203862.13894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203862.13896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203862.13898: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203862.13900: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203862.13902: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203862.14086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203862.14181: Set connection var ansible_shell_type to sh 15896 1727203862.14564: Set connection var ansible_connection to ssh 15896 1727203862.14568: Set connection var ansible_shell_executable to /bin/sh 15896 1727203862.14570: Set connection var ansible_pipelining to False 15896 1727203862.14573: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203862.14577: Set connection var ansible_timeout to 10 15896 1727203862.14586: variable 'ansible_shell_executable' from source: unknown 15896 1727203862.14588: variable 'ansible_connection' from source: unknown 15896 1727203862.14590: variable 'ansible_module_compression' from source: unknown 15896 1727203862.14591: variable 'ansible_shell_type' from source: unknown 15896 1727203862.14593: variable 'ansible_shell_executable' from source: unknown 15896 1727203862.14594: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203862.14596: variable 'ansible_pipelining' from source: unknown 15896 1727203862.14604: variable 'ansible_timeout' from source: unknown 15896 1727203862.14606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203862.15285: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203862.15289: variable 'omit' from source: magic vars 15896 1727203862.15291: starting attempt 
loop 15896 1727203862.15293: running the handler 15896 1727203862.15298: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203862.15300: _low_level_execute_command(): starting 15896 1727203862.15302: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203862.16962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203862.17105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.17247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.17293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203862.19209: stdout chunk (state=3): >>>/root <<< 15896 1727203862.19456: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 15896 1727203862.19461: stdout chunk (state=3): >>><<< 15896 1727203862.19464: stderr chunk (state=3): >>><<< 15896 1727203862.19489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203862.19516: _low_level_execute_command(): starting 15896 1727203862.19530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461 `" && echo ansible-tmp-1727203862.1949694-16632-269987482362461="` echo /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461 `" ) && sleep 0' 15896 1727203862.20403: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
<<< 15896 1727203862.20444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203862.20566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203862.20571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.20589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.20683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203862.22856: stdout chunk (state=3): >>>ansible-tmp-1727203862.1949694-16632-269987482362461=/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461 <<< 15896 1727203862.23277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203862.23282: stdout chunk (state=3): >>><<< 15896 1727203862.23285: stderr chunk (state=3): >>><<< 15896 1727203862.23288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203862.1949694-16632-269987482362461=/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203862.23290: variable 'ansible_module_compression' from source: unknown 15896 1727203862.23684: ANSIBALLZ: Using generic lock for ansible.legacy.command 15896 1727203862.23687: ANSIBALLZ: Acquiring lock 15896 1727203862.23689: ANSIBALLZ: Lock acquired: 140082272719056 15896 1727203862.23691: ANSIBALLZ: Creating module 15896 1727203862.43329: ANSIBALLZ: Writing module into payload 15896 1727203862.43490: ANSIBALLZ: Writing module 15896 1727203862.43494: ANSIBALLZ: Renaming module 15896 1727203862.43496: ANSIBALLZ: Done creating module 15896 1727203862.43498: variable 'ansible_facts' from source: unknown 15896 1727203862.43557: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py 15896 1727203862.43742: Sending initial data 15896 1727203862.43746: Sent initial data (156 bytes) 15896 1727203862.44382: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203862.44386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203862.44398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203862.44418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203862.44492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203862.44526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203862.44555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.44617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.44705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203862.46552: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203862.46648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203862.46739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpw9zeobas /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py <<< 15896 1727203862.46742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py" <<< 15896 1727203862.46921: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpw9zeobas" to remote "/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py" <<< 15896 1727203862.48401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203862.48436: stderr chunk (state=3): >>><<< 15896 1727203862.48439: stdout chunk (state=3): >>><<< 15896 1727203862.48458: done transferring module to remote 15896 1727203862.48640: _low_level_execute_command(): starting 15896 1727203862.48645: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/ /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py && sleep 0' 15896 1727203862.49901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203862.50078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.50083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.50124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203862.52074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203862.52387: stderr chunk (state=3): >>><<< 15896 1727203862.52391: stdout chunk (state=3): >>><<< 15896 1727203862.52394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203862.52397: _low_level_execute_command(): starting 15896 1727203862.52399: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/AnsiballZ_command.py && sleep 0' 15896 1727203862.53674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203862.53724: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203862.53728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203862.53730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203862.53732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203862.53876: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203862.54056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203862.54193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203863.94491: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set 
test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:51:02.703368", "end": "2024-09-24 14:51:03.942824", "delta": "0:00:01.239456", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203863.96483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203863.96487: stdout chunk (state=3): >>><<< 15896 1727203863.96489: stderr chunk (state=3): >>><<< 15896 1727203863.96492: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:51:02.703368", "end": "2024-09-24 14:51:03.942824", "delta": "0:00:01.239456", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203863.96500: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203863.96502: _low_level_execute_command(): starting 15896 1727203863.96504: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203862.1949694-16632-269987482362461/ > /dev/null 2>&1 && sleep 0' 15896 1727203863.97096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203863.97111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203863.97127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203863.97187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203863.97246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203863.97290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203863.97403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203863.99684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203863.99688: stdout chunk (state=3): >>><<< 15896 1727203863.99690: stderr chunk (state=3): >>><<< 15896 1727203863.99693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203863.99695: handler run complete 15896 1727203863.99697: Evaluated conditional (False): False 15896 1727203863.99699: attempt loop complete, returning result 15896 1727203863.99700: _execute() done 15896 1727203863.99702: dumping result to json 15896 1727203863.99704: done dumping result, returning 15896 1727203863.99706: done running TaskExecutor() for managed-node1/TASK: Create test interfaces [028d2410-947f-fb83-b6ad-000000000012] 15896 1727203863.99708: sending task result for task 028d2410-947f-fb83-b6ad-000000000012 ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.239456", "end": "2024-09-24 14:51:03.942824", "rc": 0, "start": "2024-09-24 14:51:02.703368" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 15896 1727203863.99911: no more pending results, returning what we have 15896 1727203863.99915: results queue empty 15896 1727203863.99915: checking for any_errors_fatal 15896 1727203863.99923: done checking for any_errors_fatal 15896 1727203863.99924: checking for max_fail_percentage 15896 1727203863.99926: done checking for max_fail_percentage 15896 1727203863.99926: checking to see if all hosts have failed and 
the running result is not ok 15896 1727203863.99927: done checking to see if all hosts have failed 15896 1727203863.99928: getting the remaining hosts for this loop 15896 1727203863.99929: done getting the remaining hosts for this loop 15896 1727203863.99933: getting the next task for host managed-node1 15896 1727203863.99942: done getting next task for host managed-node1 15896 1727203863.99945: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15896 1727203863.99949: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203863.99952: getting variables 15896 1727203863.99954: in VariableManager get_vars() 15896 1727203864.00222: Calling all_inventory to load vars for managed-node1 15896 1727203864.00225: Calling groups_inventory to load vars for managed-node1 15896 1727203864.00228: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.00239: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.00242: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.00245: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.00625: done sending task result for task 028d2410-947f-fb83-b6ad-000000000012 15896 1727203864.00629: WORKER PROCESS EXITING 15896 1727203864.00645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.01240: done with get_vars() 15896 1727203864.01251: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:51:04 -0400 (0:00:01.922) 0:00:09.602 ***** 15896 1727203864.01341: entering _queue_task() for managed-node1/include_tasks 15896 1727203864.01739: worker is 1 (out of 1 available) 15896 1727203864.01751: exiting _queue_task() for managed-node1/include_tasks 15896 1727203864.01766: done queuing things up, now waiting for results queue to drain 15896 1727203864.01768: waiting for pending results... 
15896 1727203864.02395: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 15896 1727203864.02401: in run() - task 028d2410-947f-fb83-b6ad-000000000016 15896 1727203864.02408: variable 'ansible_search_path' from source: unknown 15896 1727203864.02413: variable 'ansible_search_path' from source: unknown 15896 1727203864.02588: calling self._execute() 15896 1727203864.02820: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.02824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.02828: variable 'omit' from source: magic vars 15896 1727203864.03536: variable 'ansible_distribution_major_version' from source: facts 15896 1727203864.03633: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203864.03676: _execute() done 15896 1727203864.03804: dumping result to json 15896 1727203864.03813: done dumping result, returning 15896 1727203864.03825: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-fb83-b6ad-000000000016] 15896 1727203864.03835: sending task result for task 028d2410-947f-fb83-b6ad-000000000016 15896 1727203864.04013: no more pending results, returning what we have 15896 1727203864.04018: in VariableManager get_vars() 15896 1727203864.04079: Calling all_inventory to load vars for managed-node1 15896 1727203864.04082: Calling groups_inventory to load vars for managed-node1 15896 1727203864.04085: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.04102: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.04105: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.04108: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.04840: done sending task result for task 028d2410-947f-fb83-b6ad-000000000016 15896 1727203864.04843: WORKER PROCESS EXITING 15896 
1727203864.04982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.05802: done with get_vars() 15896 1727203864.05814: variable 'ansible_search_path' from source: unknown 15896 1727203864.05815: variable 'ansible_search_path' from source: unknown 15896 1727203864.06507: we have included files to process 15896 1727203864.06509: generating all_blocks data 15896 1727203864.06511: done generating all_blocks data 15896 1727203864.06512: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.06513: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.06515: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.07247: done processing included file 15896 1727203864.07250: iterating over new_blocks loaded from include file 15896 1727203864.07252: in VariableManager get_vars() 15896 1727203864.07288: done with get_vars() 15896 1727203864.07290: filtering new block on tags 15896 1727203864.07308: done filtering new block on tags 15896 1727203864.07311: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 15896 1727203864.07317: extending task lists for all hosts with included blocks 15896 1727203864.07633: done extending task lists 15896 1727203864.07635: done processing included files 15896 1727203864.07636: results queue empty 15896 1727203864.07636: checking for any_errors_fatal 15896 1727203864.07645: done checking for any_errors_fatal 15896 1727203864.07646: checking for max_fail_percentage 15896 1727203864.07647: done checking for 
max_fail_percentage 15896 1727203864.07648: checking to see if all hosts have failed and the running result is not ok 15896 1727203864.07649: done checking to see if all hosts have failed 15896 1727203864.07649: getting the remaining hosts for this loop 15896 1727203864.07651: done getting the remaining hosts for this loop 15896 1727203864.07654: getting the next task for host managed-node1 15896 1727203864.07661: done getting next task for host managed-node1 15896 1727203864.07664: ^ task is: TASK: Get stat for interface {{ interface }} 15896 1727203864.07674: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203864.07679: getting variables 15896 1727203864.07680: in VariableManager get_vars() 15896 1727203864.07707: Calling all_inventory to load vars for managed-node1 15896 1727203864.07710: Calling groups_inventory to load vars for managed-node1 15896 1727203864.07712: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.07718: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.07721: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.07723: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.08121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.08550: done with get_vars() 15896 1727203864.08564: done getting variables 15896 1727203864.08940: variable 'interface' from source: task vars 15896 1727203864.08946: variable 'dhcp_interface1' from source: play vars 15896 1727203864.09220: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:51:04 -0400 (0:00:00.081) 0:00:09.684 ***** 15896 1727203864.09517: entering _queue_task() for managed-node1/stat 15896 1727203864.10309: worker is 1 (out of 1 available) 15896 1727203864.10322: exiting _queue_task() for managed-node1/stat 15896 1727203864.10335: done queuing things up, now waiting for results queue to drain 15896 1727203864.10337: waiting for pending results... 
15896 1727203864.10739: running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 15896 1727203864.11054: in run() - task 028d2410-947f-fb83-b6ad-000000000248 15896 1727203864.11124: variable 'ansible_search_path' from source: unknown 15896 1727203864.11128: variable 'ansible_search_path' from source: unknown 15896 1727203864.11131: calling self._execute() 15896 1727203864.11303: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.11349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.11364: variable 'omit' from source: magic vars 15896 1727203864.12133: variable 'ansible_distribution_major_version' from source: facts 15896 1727203864.12362: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203864.12365: variable 'omit' from source: magic vars 15896 1727203864.12367: variable 'omit' from source: magic vars 15896 1727203864.12659: variable 'interface' from source: task vars 15896 1727203864.12681: variable 'dhcp_interface1' from source: play vars 15896 1727203864.12738: variable 'dhcp_interface1' from source: play vars 15896 1727203864.12764: variable 'omit' from source: magic vars 15896 1727203864.12881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203864.12884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203864.13480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203864.13484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.13487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.13490: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203864.13492: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.13494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.13497: Set connection var ansible_shell_type to sh 15896 1727203864.13499: Set connection var ansible_connection to ssh 15896 1727203864.13501: Set connection var ansible_shell_executable to /bin/sh 15896 1727203864.14180: Set connection var ansible_pipelining to False 15896 1727203864.14184: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203864.14187: Set connection var ansible_timeout to 10 15896 1727203864.14190: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.14192: variable 'ansible_connection' from source: unknown 15896 1727203864.14194: variable 'ansible_module_compression' from source: unknown 15896 1727203864.14197: variable 'ansible_shell_type' from source: unknown 15896 1727203864.14199: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.14201: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.14204: variable 'ansible_pipelining' from source: unknown 15896 1727203864.14206: variable 'ansible_timeout' from source: unknown 15896 1727203864.14208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.14458: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203864.14780: variable 'omit' from source: magic vars 15896 1727203864.14785: starting attempt loop 15896 1727203864.14787: running the handler 15896 1727203864.14791: _low_level_execute_command(): starting 15896 1727203864.14793: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 
1727203864.16041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203864.16137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.16156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.16186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203864.16317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.16515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.16616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.18517: stdout chunk (state=3): >>>/root <<< 15896 1727203864.18532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.18568: stderr chunk (state=3): >>><<< 15896 1727203864.18580: stdout chunk (state=3): >>><<< 15896 1727203864.18609: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.18681: _low_level_execute_command(): starting 15896 1727203864.18694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790 `" && echo ansible-tmp-1727203864.186652-16974-19411615260790="` echo /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790 `" ) && sleep 0' 15896 1727203864.20702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.20998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.21373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.23512: stdout chunk (state=3): >>>ansible-tmp-1727203864.186652-16974-19411615260790=/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790 <<< 15896 1727203864.23622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.23659: stderr chunk (state=3): >>><<< 15896 1727203864.23669: stdout chunk (state=3): >>><<< 15896 1727203864.23723: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203864.186652-16974-19411615260790=/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.23834: variable 'ansible_module_compression' from source: unknown 15896 1727203864.24083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203864.24086: variable 'ansible_facts' from source: unknown 15896 1727203864.24381: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py 15896 1727203864.25016: Sending initial data 15896 1727203864.25025: Sent initial data (151 bytes) 15896 1727203864.26024: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.26054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.26305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.26318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.26418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.28431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203864.28507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203864.28590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpe8n7r27c /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py <<< 15896 1727203864.28594: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py" <<< 15896 1727203864.28667: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpe8n7r27c" to remote "/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py" <<< 15896 1727203864.30192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.30411: stderr chunk (state=3): >>><<< 15896 1727203864.30415: stdout chunk (state=3): >>><<< 15896 1727203864.30524: done transferring module to remote 15896 1727203864.30542: _low_level_execute_command(): starting 15896 1727203864.30555: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/ /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py && sleep 0' 15896 1727203864.32199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.32216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203864.32226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.32292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.32304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.32517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.32596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.35011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.35016: stdout chunk (state=3): >>><<< 15896 1727203864.35019: stderr chunk (state=3): >>><<< 15896 1727203864.35021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.35030: _low_level_execute_command(): starting 15896 1727203864.35032: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/AnsiballZ_stat.py && sleep 0' 15896 1727203864.36267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.36271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.36274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.36329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.36348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.36452: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.53053: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27835, "dev": 23, "nlink": 1, "atime": 1727203862.7102246, "mtime": 1727203862.7102246, "ctime": 1727203862.7102246, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203864.54898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203864.54908: stdout chunk (state=3): >>><<< 15896 1727203864.54914: stderr chunk (state=3): >>><<< 15896 1727203864.54932: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27835, "dev": 23, "nlink": 1, "atime": 1727203862.7102246, "mtime": 1727203862.7102246, "ctime": 1727203862.7102246, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203864.55197: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203864.55207: _low_level_execute_command(): starting 15896 1727203864.55212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203864.186652-16974-19411615260790/ > /dev/null 2>&1 && sleep 0' 15896 1727203864.56497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203864.56510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.56523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.56538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203864.56552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 
1727203864.56564: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203864.56581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.56598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203864.56607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203864.56616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203864.56701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.56714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.56732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.56834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.58938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.58948: stdout chunk (state=3): >>><<< 15896 1727203864.58980: stderr chunk (state=3): >>><<< 15896 1727203864.59006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.59017: handler run complete 15896 1727203864.59072: attempt loop complete, returning result 15896 1727203864.59083: _execute() done 15896 1727203864.59091: dumping result to json 15896 1727203864.59107: done dumping result, returning 15896 1727203864.59120: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 [028d2410-947f-fb83-b6ad-000000000248] 15896 1727203864.59128: sending task result for task 028d2410-947f-fb83-b6ad-000000000248 15896 1727203864.59484: done sending task result for task 028d2410-947f-fb83-b6ad-000000000248 15896 1727203864.59487: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203862.7102246, "block_size": 4096, "blocks": 0, "ctime": 1727203862.7102246, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27835, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727203862.7102246, "nlink": 1, "path": 
"/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15896 1727203864.59582: no more pending results, returning what we have 15896 1727203864.59585: results queue empty 15896 1727203864.59586: checking for any_errors_fatal 15896 1727203864.59588: done checking for any_errors_fatal 15896 1727203864.59588: checking for max_fail_percentage 15896 1727203864.59590: done checking for max_fail_percentage 15896 1727203864.59591: checking to see if all hosts have failed and the running result is not ok 15896 1727203864.59591: done checking to see if all hosts have failed 15896 1727203864.59592: getting the remaining hosts for this loop 15896 1727203864.59593: done getting the remaining hosts for this loop 15896 1727203864.59599: getting the next task for host managed-node1 15896 1727203864.59606: done getting next task for host managed-node1 15896 1727203864.59609: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15896 1727203864.59611: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203864.59616: getting variables 15896 1727203864.59618: in VariableManager get_vars() 15896 1727203864.59668: Calling all_inventory to load vars for managed-node1 15896 1727203864.59672: Calling groups_inventory to load vars for managed-node1 15896 1727203864.59674: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.59690: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.59693: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.59696: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.59984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.60305: done with get_vars() 15896 1727203864.60318: done getting variables 15896 1727203864.60483: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15896 1727203864.60630: variable 'interface' from source: task vars 15896 1727203864.60634: variable 'dhcp_interface1' from source: play vars 15896 1727203864.60701: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:51:04 -0400 (0:00:00.512) 0:00:10.196 ***** 15896 1727203864.60732: entering _queue_task() for managed-node1/assert 15896 1727203864.60734: Creating lock for assert 15896 1727203864.61066: worker is 1 (out of 1 available) 15896 1727203864.61209: exiting _queue_task() for managed-node1/assert 15896 1727203864.61220: done queuing things up, now waiting for results queue to drain 15896 
1727203864.61222: waiting for pending results... 15896 1727203864.61373: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' 15896 1727203864.61497: in run() - task 028d2410-947f-fb83-b6ad-000000000017 15896 1727203864.61516: variable 'ansible_search_path' from source: unknown 15896 1727203864.61522: variable 'ansible_search_path' from source: unknown 15896 1727203864.61566: calling self._execute() 15896 1727203864.61751: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.61770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.61859: variable 'omit' from source: magic vars 15896 1727203864.62450: variable 'ansible_distribution_major_version' from source: facts 15896 1727203864.62470: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203864.62484: variable 'omit' from source: magic vars 15896 1727203864.62619: variable 'omit' from source: magic vars 15896 1727203864.62945: variable 'interface' from source: task vars 15896 1727203864.62949: variable 'dhcp_interface1' from source: play vars 15896 1727203864.62951: variable 'dhcp_interface1' from source: play vars 15896 1727203864.62991: variable 'omit' from source: magic vars 15896 1727203864.63036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203864.63199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203864.63224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203864.63245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.63286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.63320: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203864.63662: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.63665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.63690: Set connection var ansible_shell_type to sh 15896 1727203864.63701: Set connection var ansible_connection to ssh 15896 1727203864.63708: Set connection var ansible_shell_executable to /bin/sh 15896 1727203864.63716: Set connection var ansible_pipelining to False 15896 1727203864.63724: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203864.63731: Set connection var ansible_timeout to 10 15896 1727203864.63798: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.63805: variable 'ansible_connection' from source: unknown 15896 1727203864.63811: variable 'ansible_module_compression' from source: unknown 15896 1727203864.63879: variable 'ansible_shell_type' from source: unknown 15896 1727203864.63890: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.63897: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.63906: variable 'ansible_pipelining' from source: unknown 15896 1727203864.63913: variable 'ansible_timeout' from source: unknown 15896 1727203864.63920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.64132: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203864.64148: variable 'omit' from source: magic vars 15896 1727203864.64161: starting attempt loop 15896 1727203864.64169: running the handler 15896 1727203864.64315: variable 'interface_stat' from source: set_fact 15896 
1727203864.64340: Evaluated conditional (interface_stat.stat.exists): True 15896 1727203864.64353: handler run complete 15896 1727203864.64377: attempt loop complete, returning result 15896 1727203864.64385: _execute() done 15896 1727203864.64392: dumping result to json 15896 1727203864.64400: done dumping result, returning 15896 1727203864.64411: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' [028d2410-947f-fb83-b6ad-000000000017] 15896 1727203864.64419: sending task result for task 028d2410-947f-fb83-b6ad-000000000017 15896 1727203864.64546: done sending task result for task 028d2410-947f-fb83-b6ad-000000000017 15896 1727203864.64549: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203864.64668: no more pending results, returning what we have 15896 1727203864.64671: results queue empty 15896 1727203864.64672: checking for any_errors_fatal 15896 1727203864.64681: done checking for any_errors_fatal 15896 1727203864.64682: checking for max_fail_percentage 15896 1727203864.64684: done checking for max_fail_percentage 15896 1727203864.64684: checking to see if all hosts have failed and the running result is not ok 15896 1727203864.64685: done checking to see if all hosts have failed 15896 1727203864.64686: getting the remaining hosts for this loop 15896 1727203864.64688: done getting the remaining hosts for this loop 15896 1727203864.64706: getting the next task for host managed-node1 15896 1727203864.64715: done getting next task for host managed-node1 15896 1727203864.64719: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15896 1727203864.64722: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203864.64726: getting variables 15896 1727203864.64728: in VariableManager get_vars() 15896 1727203864.64879: Calling all_inventory to load vars for managed-node1 15896 1727203864.64882: Calling groups_inventory to load vars for managed-node1 15896 1727203864.64885: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.64894: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.64896: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.64899: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.65052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.65443: done with get_vars() 15896 1727203864.65456: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:51:04 -0400 (0:00:00.048) 0:00:10.244 ***** 15896 1727203864.65550: entering _queue_task() for managed-node1/include_tasks 15896 1727203864.65978: worker is 1 (out of 1 available) 15896 1727203864.65990: exiting _queue_task() for managed-node1/include_tasks 15896 1727203864.66004: done queuing things up, now waiting for results queue to drain 15896 1727203864.66005: waiting for pending results... 
15896 1727203864.66436: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 15896 1727203864.66528: in run() - task 028d2410-947f-fb83-b6ad-00000000001b 15896 1727203864.66535: variable 'ansible_search_path' from source: unknown 15896 1727203864.66538: variable 'ansible_search_path' from source: unknown 15896 1727203864.66644: calling self._execute() 15896 1727203864.66688: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.66699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.66714: variable 'omit' from source: magic vars 15896 1727203864.67099: variable 'ansible_distribution_major_version' from source: facts 15896 1727203864.67116: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203864.67126: _execute() done 15896 1727203864.67135: dumping result to json 15896 1727203864.67144: done dumping result, returning 15896 1727203864.67155: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-fb83-b6ad-00000000001b] 15896 1727203864.67165: sending task result for task 028d2410-947f-fb83-b6ad-00000000001b 15896 1727203864.67311: no more pending results, returning what we have 15896 1727203864.67317: in VariableManager get_vars() 15896 1727203864.67379: Calling all_inventory to load vars for managed-node1 15896 1727203864.67382: Calling groups_inventory to load vars for managed-node1 15896 1727203864.67385: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.67399: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.67402: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.67405: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.67824: done sending task result for task 028d2410-947f-fb83-b6ad-00000000001b 15896 1727203864.67828: WORKER PROCESS EXITING 15896 
1727203864.67848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.68255: done with get_vars() 15896 1727203864.68263: variable 'ansible_search_path' from source: unknown 15896 1727203864.68264: variable 'ansible_search_path' from source: unknown 15896 1727203864.68299: we have included files to process 15896 1727203864.68300: generating all_blocks data 15896 1727203864.68301: done generating all_blocks data 15896 1727203864.68304: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.68305: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.68307: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203864.68510: done processing included file 15896 1727203864.68513: iterating over new_blocks loaded from include file 15896 1727203864.68515: in VariableManager get_vars() 15896 1727203864.68542: done with get_vars() 15896 1727203864.68544: filtering new block on tags 15896 1727203864.68564: done filtering new block on tags 15896 1727203864.68567: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 15896 1727203864.68573: extending task lists for all hosts with included blocks 15896 1727203864.68666: done extending task lists 15896 1727203864.68667: done processing included files 15896 1727203864.68668: results queue empty 15896 1727203864.68669: checking for any_errors_fatal 15896 1727203864.68673: done checking for any_errors_fatal 15896 1727203864.68673: checking for max_fail_percentage 15896 1727203864.68675: done checking for 
max_fail_percentage 15896 1727203864.68677: checking to see if all hosts have failed and the running result is not ok 15896 1727203864.68678: done checking to see if all hosts have failed 15896 1727203864.68678: getting the remaining hosts for this loop 15896 1727203864.68680: done getting the remaining hosts for this loop 15896 1727203864.68682: getting the next task for host managed-node1 15896 1727203864.68686: done getting next task for host managed-node1 15896 1727203864.68688: ^ task is: TASK: Get stat for interface {{ interface }} 15896 1727203864.68690: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203864.68693: getting variables 15896 1727203864.68694: in VariableManager get_vars() 15896 1727203864.68713: Calling all_inventory to load vars for managed-node1 15896 1727203864.68715: Calling groups_inventory to load vars for managed-node1 15896 1727203864.68717: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203864.68723: Calling all_plugins_play to load vars for managed-node1 15896 1727203864.68725: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203864.68728: Calling groups_plugins_play to load vars for managed-node1 15896 1727203864.68863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203864.69056: done with get_vars() 15896 1727203864.69065: done getting variables 15896 1727203864.69226: variable 'interface' from source: task vars 15896 1727203864.69229: variable 'dhcp_interface2' from source: play vars 15896 1727203864.69290: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:51:04 -0400 (0:00:00.037) 0:00:10.282 ***** 15896 1727203864.69325: entering _queue_task() for managed-node1/stat 15896 1727203864.69775: worker is 1 (out of 1 available) 15896 1727203864.69868: exiting _queue_task() for managed-node1/stat 15896 1727203864.69881: done queuing things up, now waiting for results queue to drain 15896 1727203864.69883: waiting for pending results... 
15896 1727203864.70054: running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 15896 1727203864.70194: in run() - task 028d2410-947f-fb83-b6ad-000000000260 15896 1727203864.70219: variable 'ansible_search_path' from source: unknown 15896 1727203864.70227: variable 'ansible_search_path' from source: unknown 15896 1727203864.70266: calling self._execute() 15896 1727203864.70363: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.70377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.70390: variable 'omit' from source: magic vars 15896 1727203864.70990: variable 'ansible_distribution_major_version' from source: facts 15896 1727203864.71006: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203864.71017: variable 'omit' from source: magic vars 15896 1727203864.71081: variable 'omit' from source: magic vars 15896 1727203864.71189: variable 'interface' from source: task vars 15896 1727203864.71200: variable 'dhcp_interface2' from source: play vars 15896 1727203864.71301: variable 'dhcp_interface2' from source: play vars 15896 1727203864.71304: variable 'omit' from source: magic vars 15896 1727203864.71334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203864.71371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203864.71396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203864.71420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.71627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203864.71630: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203864.71633: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.71635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.71637: Set connection var ansible_shell_type to sh 15896 1727203864.71639: Set connection var ansible_connection to ssh 15896 1727203864.71641: Set connection var ansible_shell_executable to /bin/sh 15896 1727203864.71643: Set connection var ansible_pipelining to False 15896 1727203864.71645: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203864.71647: Set connection var ansible_timeout to 10 15896 1727203864.71736: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.71739: variable 'ansible_connection' from source: unknown 15896 1727203864.71742: variable 'ansible_module_compression' from source: unknown 15896 1727203864.71744: variable 'ansible_shell_type' from source: unknown 15896 1727203864.71747: variable 'ansible_shell_executable' from source: unknown 15896 1727203864.71749: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203864.71751: variable 'ansible_pipelining' from source: unknown 15896 1727203864.71753: variable 'ansible_timeout' from source: unknown 15896 1727203864.71761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203864.71978: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203864.71992: variable 'omit' from source: magic vars 15896 1727203864.72003: starting attempt loop 15896 1727203864.72006: running the handler 15896 1727203864.72021: _low_level_execute_command(): starting 15896 1727203864.72034: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 
1727203864.72880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.72903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.73018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.74936: stdout chunk (state=3): >>>/root <<< 15896 1727203864.75082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.75085: stdout chunk (state=3): >>><<< 15896 1727203864.75087: stderr chunk (state=3): >>><<< 15896 1727203864.75108: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.75123: _low_level_execute_command(): starting 15896 1727203864.75131: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136 `" && echo ansible-tmp-1727203864.7510839-17059-163976547738136="` echo /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136 `" ) && sleep 0' 15896 1727203864.75745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203864.75753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.75771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.75782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203864.75794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203864.75801: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203864.75811: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.75881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203864.75884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203864.75887: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203864.75888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.75890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.75892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203864.75894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203864.75896: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203864.75898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.75940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.76039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.76053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.76159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.78326: stdout chunk (state=3): >>>ansible-tmp-1727203864.7510839-17059-163976547738136=/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136 <<< 15896 1727203864.78508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.78512: stdout chunk (state=3): >>><<< 15896 1727203864.78514: stderr chunk (state=3): >>><<< 15896 1727203864.78533: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203864.7510839-17059-163976547738136=/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.78683: variable 'ansible_module_compression' from source: unknown 15896 1727203864.78686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203864.78698: variable 'ansible_facts' from source: unknown 15896 1727203864.78804: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py 15896 1727203864.79005: Sending initial data 15896 1727203864.79008: Sent initial data (153 bytes) 15896 1727203864.79888: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203864.79895: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.79906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.79998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.80007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203864.80019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.80055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.80200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.81963: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203864.82000: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203864.82065: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203864.82147: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpysyf6as9 /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py <<< 15896 1727203864.82151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py" <<< 15896 1727203864.82219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpysyf6as9" to remote "/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py" <<< 15896 1727203864.83279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.83282: stdout chunk (state=3): >>><<< 15896 1727203864.83284: stderr chunk (state=3): >>><<< 15896 1727203864.83285: done transferring module to remote 15896 1727203864.83287: _low_level_execute_command(): starting 15896 1727203864.83289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/ /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py && sleep 0' 15896 1727203864.84072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.84130: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203864.84154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.84177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.84264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203864.86393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203864.86397: stdout chunk (state=3): >>><<< 15896 1727203864.86400: stderr chunk (state=3): >>><<< 15896 1727203864.86403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203864.86405: _low_level_execute_command(): starting 15896 1727203864.86407: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/AnsiballZ_stat.py && sleep 0' 15896 1727203864.86989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203864.87004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203864.87020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203864.87046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203864.87161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203864.87193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203864.87314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.03710: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28241, "dev": 23, "nlink": 1, "atime": 1727203862.7170672, "mtime": 1727203862.7170672, "ctime": 1727203862.7170672, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203865.05280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203865.05289: stdout chunk (state=3): >>><<< 15896 1727203865.05292: stderr chunk (state=3): >>><<< 15896 1727203865.05451: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28241, "dev": 23, "nlink": 1, "atime": 1727203862.7170672, "mtime": 1727203862.7170672, "ctime": 1727203862.7170672, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203865.05455: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203865.05458: _low_level_execute_command(): starting 15896 1727203865.05463: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203864.7510839-17059-163976547738136/ > /dev/null 2>&1 && sleep 0' 15896 1727203865.06016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203865.06027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203865.06039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203865.06054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203865.06071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 
15896 1727203865.06085: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203865.06100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.06116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203865.06126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203865.06134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203865.06143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203865.06153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203865.06193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.06249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203865.06267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203865.06292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.06399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.08419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203865.08432: stdout chunk (state=3): >>><<< 15896 1727203865.08442: stderr chunk (state=3): >>><<< 15896 1727203865.08466: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203865.08482: handler run complete 15896 1727203865.08537: attempt loop complete, returning result 15896 1727203865.08550: _execute() done 15896 1727203865.08561: dumping result to json 15896 1727203865.08572: done dumping result, returning 15896 1727203865.08587: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 [028d2410-947f-fb83-b6ad-000000000260] 15896 1727203865.08651: sending task result for task 028d2410-947f-fb83-b6ad-000000000260 15896 1727203865.08741: done sending task result for task 028d2410-947f-fb83-b6ad-000000000260 15896 1727203865.08744: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203862.7170672, "block_size": 4096, "blocks": 0, "ctime": 1727203862.7170672, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28241, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": 
"/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727203862.7170672, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15896 1727203865.09053: no more pending results, returning what we have 15896 1727203865.09057: results queue empty 15896 1727203865.09058: checking for any_errors_fatal 15896 1727203865.09062: done checking for any_errors_fatal 15896 1727203865.09063: checking for max_fail_percentage 15896 1727203865.09065: done checking for max_fail_percentage 15896 1727203865.09066: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.09066: done checking to see if all hosts have failed 15896 1727203865.09067: getting the remaining hosts for this loop 15896 1727203865.09069: done getting the remaining hosts for this loop 15896 1727203865.09073: getting the next task for host managed-node1 15896 1727203865.09083: done getting next task for host managed-node1 15896 1727203865.09086: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15896 1727203865.09089: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203865.09093: getting variables 15896 1727203865.09095: in VariableManager get_vars() 15896 1727203865.09153: Calling all_inventory to load vars for managed-node1 15896 1727203865.09155: Calling groups_inventory to load vars for managed-node1 15896 1727203865.09158: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.09173: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.09291: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.09296: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.09568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.09752: done with get_vars() 15896 1727203865.09765: done getting variables 15896 1727203865.09824: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203865.09948: variable 'interface' from source: task vars 15896 1727203865.09952: variable 'dhcp_interface2' from source: play vars 15896 1727203865.10015: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.407) 0:00:10.689 ***** 15896 1727203865.10053: entering _queue_task() for managed-node1/assert 15896 1727203865.10349: worker is 1 (out of 1 available) 15896 1727203865.10363: exiting _queue_task() for managed-node1/assert 15896 1727203865.10479: done queuing things up, now waiting for results queue to drain 15896 1727203865.10482: waiting for pending results... 
15896 1727203865.10792: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' 15896 1727203865.10797: in run() - task 028d2410-947f-fb83-b6ad-00000000001c 15896 1727203865.10800: variable 'ansible_search_path' from source: unknown 15896 1727203865.10802: variable 'ansible_search_path' from source: unknown 15896 1727203865.10836: calling self._execute() 15896 1727203865.10934: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.10945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.10958: variable 'omit' from source: magic vars 15896 1727203865.11337: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.11379: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.11386: variable 'omit' from source: magic vars 15896 1727203865.11429: variable 'omit' from source: magic vars 15896 1727203865.11538: variable 'interface' from source: task vars 15896 1727203865.11580: variable 'dhcp_interface2' from source: play vars 15896 1727203865.11627: variable 'dhcp_interface2' from source: play vars 15896 1727203865.11649: variable 'omit' from source: magic vars 15896 1727203865.11703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203865.11747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203865.11780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203865.11820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.11822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.11857: variable 'inventory_hostname' from source: host 
vars for 'managed-node1' 15896 1727203865.11905: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.11909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.11984: Set connection var ansible_shell_type to sh 15896 1727203865.11997: Set connection var ansible_connection to ssh 15896 1727203865.12006: Set connection var ansible_shell_executable to /bin/sh 15896 1727203865.12021: Set connection var ansible_pipelining to False 15896 1727203865.12078: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203865.12085: Set connection var ansible_timeout to 10 15896 1727203865.12087: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.12089: variable 'ansible_connection' from source: unknown 15896 1727203865.12091: variable 'ansible_module_compression' from source: unknown 15896 1727203865.12093: variable 'ansible_shell_type' from source: unknown 15896 1727203865.12094: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.12096: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.12099: variable 'ansible_pipelining' from source: unknown 15896 1727203865.12107: variable 'ansible_timeout' from source: unknown 15896 1727203865.12114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.12267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203865.12286: variable 'omit' from source: magic vars 15896 1727203865.12300: starting attempt loop 15896 1727203865.12342: running the handler 15896 1727203865.12451: variable 'interface_stat' from source: set_fact 15896 1727203865.12484: Evaluated conditional 
(interface_stat.stat.exists): True 15896 1727203865.12494: handler run complete 15896 1727203865.12512: attempt loop complete, returning result 15896 1727203865.12518: _execute() done 15896 1727203865.12525: dumping result to json 15896 1727203865.12562: done dumping result, returning 15896 1727203865.12565: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' [028d2410-947f-fb83-b6ad-00000000001c] 15896 1727203865.12567: sending task result for task 028d2410-947f-fb83-b6ad-00000000001c ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203865.12824: no more pending results, returning what we have 15896 1727203865.12828: results queue empty 15896 1727203865.12829: checking for any_errors_fatal 15896 1727203865.12836: done checking for any_errors_fatal 15896 1727203865.12837: checking for max_fail_percentage 15896 1727203865.12839: done checking for max_fail_percentage 15896 1727203865.12840: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.12840: done checking to see if all hosts have failed 15896 1727203865.12841: getting the remaining hosts for this loop 15896 1727203865.12843: done getting the remaining hosts for this loop 15896 1727203865.12846: getting the next task for host managed-node1 15896 1727203865.12854: done getting next task for host managed-node1 15896 1727203865.12856: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 15896 1727203865.12861: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203865.12864: getting variables 15896 1727203865.12866: in VariableManager get_vars() 15896 1727203865.12923: Calling all_inventory to load vars for managed-node1 15896 1727203865.12926: Calling groups_inventory to load vars for managed-node1 15896 1727203865.12929: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.12940: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.12943: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.12946: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.13311: done sending task result for task 028d2410-947f-fb83-b6ad-00000000001c 15896 1727203865.13314: WORKER PROCESS EXITING 15896 1727203865.13336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.13549: done with get_vars() 15896 1727203865.13562: done getting variables 15896 1727203865.13621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.036) 0:00:10.725 ***** 15896 1727203865.13653: entering _queue_task() for managed-node1/command 15896 1727203865.14002: worker is 1 (out of 1 available) 15896 1727203865.14012: exiting _queue_task() for managed-node1/command 15896 1727203865.14023: done queuing things up, now waiting for results queue to drain 15896 1727203865.14024: waiting for pending results... 
15896 1727203865.14250: running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript 15896 1727203865.14408: in run() - task 028d2410-947f-fb83-b6ad-00000000001d 15896 1727203865.14428: variable 'ansible_search_path' from source: unknown 15896 1727203865.14471: calling self._execute() 15896 1727203865.14569: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.14626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.14640: variable 'omit' from source: magic vars 15896 1727203865.16065: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.16087: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.16206: variable 'network_provider' from source: set_fact 15896 1727203865.16218: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203865.16226: when evaluation is False, skipping this task 15896 1727203865.16233: _execute() done 15896 1727203865.16239: dumping result to json 15896 1727203865.16246: done dumping result, returning 15896 1727203865.16266: done running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript [028d2410-947f-fb83-b6ad-00000000001d] 15896 1727203865.16275: sending task result for task 028d2410-947f-fb83-b6ad-00000000001d 15896 1727203865.16436: done sending task result for task 028d2410-947f-fb83-b6ad-00000000001d 15896 1727203865.16439: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203865.16522: no more pending results, returning what we have 15896 1727203865.16526: results queue empty 15896 1727203865.16527: checking for any_errors_fatal 15896 1727203865.16534: done checking for any_errors_fatal 15896 1727203865.16534: checking for max_fail_percentage 15896 1727203865.16536: done checking 
for max_fail_percentage 15896 1727203865.16537: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.16537: done checking to see if all hosts have failed 15896 1727203865.16538: getting the remaining hosts for this loop 15896 1727203865.16540: done getting the remaining hosts for this loop 15896 1727203865.16543: getting the next task for host managed-node1 15896 1727203865.16548: done getting next task for host managed-node1 15896 1727203865.16551: ^ task is: TASK: TEST Add Bond with 2 ports 15896 1727203865.16554: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203865.16557: getting variables 15896 1727203865.16561: in VariableManager get_vars() 15896 1727203865.16623: Calling all_inventory to load vars for managed-node1 15896 1727203865.16626: Calling groups_inventory to load vars for managed-node1 15896 1727203865.16628: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.16642: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.16645: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.16648: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.17327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.17523: done with get_vars() 15896 1727203865.17533: done getting variables 15896 1727203865.17595: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.039) 0:00:10.765 ***** 15896 1727203865.17619: entering _queue_task() for managed-node1/debug 15896 1727203865.18170: worker is 1 (out of 1 available) 15896 1727203865.18184: exiting _queue_task() for managed-node1/debug 15896 1727203865.18193: done queuing things up, now waiting for results queue to drain 15896 1727203865.18195: waiting for pending results... 15896 1727203865.18536: running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports 15896 1727203865.18685: in run() - task 028d2410-947f-fb83-b6ad-00000000001e 15896 1727203865.18868: variable 'ansible_search_path' from source: unknown 15896 1727203865.18871: calling self._execute() 15896 1727203865.19430: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.19434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.19437: variable 'omit' from source: magic vars 15896 1727203865.20281: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.20285: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.20287: variable 'omit' from source: magic vars 15896 1727203865.20422: variable 'omit' from source: magic vars 15896 1727203865.20521: variable 'omit' from source: magic vars 15896 1727203865.20783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203865.20787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203865.20789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203865.20792: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.20794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.20932: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203865.20989: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.20999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.21195: Set connection var ansible_shell_type to sh 15896 1727203865.21199: Set connection var ansible_connection to ssh 15896 1727203865.21201: Set connection var ansible_shell_executable to /bin/sh 15896 1727203865.21203: Set connection var ansible_pipelining to False 15896 1727203865.21205: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203865.21207: Set connection var ansible_timeout to 10 15896 1727203865.21209: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.21211: variable 'ansible_connection' from source: unknown 15896 1727203865.21213: variable 'ansible_module_compression' from source: unknown 15896 1727203865.21215: variable 'ansible_shell_type' from source: unknown 15896 1727203865.21217: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.21219: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.21220: variable 'ansible_pipelining' from source: unknown 15896 1727203865.21222: variable 'ansible_timeout' from source: unknown 15896 1727203865.21224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.21509: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203865.21592: variable 'omit' from source: magic vars 15896 1727203865.21648: starting attempt loop 15896 1727203865.21655: running the handler 15896 1727203865.21768: handler run complete 15896 1727203865.21793: attempt loop complete, returning result 15896 1727203865.21801: _execute() done 15896 1727203865.21808: dumping result to json 15896 1727203865.21814: done dumping result, returning 15896 1727203865.21826: done running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports [028d2410-947f-fb83-b6ad-00000000001e] 15896 1727203865.21869: sending task result for task 028d2410-947f-fb83-b6ad-00000000001e ok: [managed-node1] => {} MSG: ################################################## 15896 1727203865.22047: no more pending results, returning what we have 15896 1727203865.22050: results queue empty 15896 1727203865.22051: checking for any_errors_fatal 15896 1727203865.22058: done checking for any_errors_fatal 15896 1727203865.22059: checking for max_fail_percentage 15896 1727203865.22061: done checking for max_fail_percentage 15896 1727203865.22061: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.22062: done checking to see if all hosts have failed 15896 1727203865.22062: getting the remaining hosts for this loop 15896 1727203865.22064: done getting the remaining hosts for this loop 15896 1727203865.22067: getting the next task for host managed-node1 15896 1727203865.22074: done getting next task for host managed-node1 15896 1727203865.22082: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203865.22086: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203865.22104: getting variables 15896 1727203865.22106: in VariableManager get_vars() 15896 1727203865.22156: Calling all_inventory to load vars for managed-node1 15896 1727203865.22159: Calling groups_inventory to load vars for managed-node1 15896 1727203865.22161: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.22173: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.22389: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.22581: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.22803: done sending task result for task 028d2410-947f-fb83-b6ad-00000000001e 15896 1727203865.22807: WORKER PROCESS EXITING 15896 1727203865.22830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.23091: done with get_vars() 15896 1727203865.23102: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.055) 0:00:10.821 ***** 15896 1727203865.23197: entering _queue_task() for managed-node1/include_tasks 15896 1727203865.23510: worker is 1 (out of 1 available) 15896 1727203865.23521: exiting _queue_task() for managed-node1/include_tasks 15896 1727203865.23532: done queuing things up, now waiting for results queue to drain 15896 1727203865.23534: waiting for pending results... 
15896 1727203865.24010: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203865.24197: in run() - task 028d2410-947f-fb83-b6ad-000000000026 15896 1727203865.24219: variable 'ansible_search_path' from source: unknown 15896 1727203865.24231: variable 'ansible_search_path' from source: unknown 15896 1727203865.24311: calling self._execute() 15896 1727203865.24491: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.24495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.24497: variable 'omit' from source: magic vars 15896 1727203865.25005: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.25024: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.25035: _execute() done 15896 1727203865.25043: dumping result to json 15896 1727203865.25050: done dumping result, returning 15896 1727203865.25066: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-fb83-b6ad-000000000026] 15896 1727203865.25078: sending task result for task 028d2410-947f-fb83-b6ad-000000000026 15896 1727203865.25382: done sending task result for task 028d2410-947f-fb83-b6ad-000000000026 15896 1727203865.25386: WORKER PROCESS EXITING 15896 1727203865.25426: no more pending results, returning what we have 15896 1727203865.25432: in VariableManager get_vars() 15896 1727203865.25499: Calling all_inventory to load vars for managed-node1 15896 1727203865.25502: Calling groups_inventory to load vars for managed-node1 15896 1727203865.25505: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.25520: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.25523: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.25526: Calling 
groups_plugins_play to load vars for managed-node1 15896 1727203865.26110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.26422: done with get_vars() 15896 1727203865.26431: variable 'ansible_search_path' from source: unknown 15896 1727203865.26432: variable 'ansible_search_path' from source: unknown 15896 1727203865.26579: we have included files to process 15896 1727203865.26581: generating all_blocks data 15896 1727203865.26583: done generating all_blocks data 15896 1727203865.26588: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203865.26589: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203865.26593: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203865.27943: done processing included file 15896 1727203865.27946: iterating over new_blocks loaded from include file 15896 1727203865.27947: in VariableManager get_vars() 15896 1727203865.27987: done with get_vars() 15896 1727203865.27989: filtering new block on tags 15896 1727203865.28006: done filtering new block on tags 15896 1727203865.28009: in VariableManager get_vars() 15896 1727203865.28037: done with get_vars() 15896 1727203865.28039: filtering new block on tags 15896 1727203865.28061: done filtering new block on tags 15896 1727203865.28064: in VariableManager get_vars() 15896 1727203865.28096: done with get_vars() 15896 1727203865.28098: filtering new block on tags 15896 1727203865.28114: done filtering new block on tags 15896 1727203865.28117: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 15896 1727203865.28122: extending task lists for 
all hosts with included blocks 15896 1727203865.29148: done extending task lists 15896 1727203865.29150: done processing included files 15896 1727203865.29151: results queue empty 15896 1727203865.29152: checking for any_errors_fatal 15896 1727203865.29155: done checking for any_errors_fatal 15896 1727203865.29156: checking for max_fail_percentage 15896 1727203865.29157: done checking for max_fail_percentage 15896 1727203865.29158: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.29161: done checking to see if all hosts have failed 15896 1727203865.29162: getting the remaining hosts for this loop 15896 1727203865.29164: done getting the remaining hosts for this loop 15896 1727203865.29166: getting the next task for host managed-node1 15896 1727203865.29171: done getting next task for host managed-node1 15896 1727203865.29174: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203865.29179: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203865.29190: getting variables 15896 1727203865.29192: in VariableManager get_vars() 15896 1727203865.29218: Calling all_inventory to load vars for managed-node1 15896 1727203865.29221: Calling groups_inventory to load vars for managed-node1 15896 1727203865.29223: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.29230: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.29233: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.29236: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.29455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.29724: done with get_vars() 15896 1727203865.29735: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.066) 0:00:10.887 ***** 15896 1727203865.29818: entering _queue_task() for managed-node1/setup 15896 1727203865.30752: worker is 1 (out of 1 available) 15896 1727203865.30765: exiting _queue_task() for managed-node1/setup 15896 1727203865.30774: done queuing things up, now waiting for results queue to drain 15896 1727203865.30779: waiting for pending results... 
15896 1727203865.30916: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203865.31246: in run() - task 028d2410-947f-fb83-b6ad-00000000027e 15896 1727203865.31250: variable 'ansible_search_path' from source: unknown 15896 1727203865.31252: variable 'ansible_search_path' from source: unknown 15896 1727203865.31255: calling self._execute() 15896 1727203865.31300: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.31314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.31347: variable 'omit' from source: magic vars 15896 1727203865.31735: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.31752: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.31971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203865.34288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203865.34403: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203865.34582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203865.34586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203865.34589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203865.34788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203865.34818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203865.34843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203865.34885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203865.34900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203865.34953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203865.34977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203865.35110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203865.35147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203865.35164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203865.35327: variable '__network_required_facts' from source: role 
'' defaults 15896 1727203865.35339: variable 'ansible_facts' from source: unknown 15896 1727203865.35442: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15896 1727203865.35445: when evaluation is False, skipping this task 15896 1727203865.35448: _execute() done 15896 1727203865.35450: dumping result to json 15896 1727203865.35453: done dumping result, returning 15896 1727203865.35464: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-fb83-b6ad-00000000027e] 15896 1727203865.35466: sending task result for task 028d2410-947f-fb83-b6ad-00000000027e 15896 1727203865.35562: done sending task result for task 028d2410-947f-fb83-b6ad-00000000027e 15896 1727203865.35564: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203865.35610: no more pending results, returning what we have 15896 1727203865.35613: results queue empty 15896 1727203865.35614: checking for any_errors_fatal 15896 1727203865.35615: done checking for any_errors_fatal 15896 1727203865.35616: checking for max_fail_percentage 15896 1727203865.35617: done checking for max_fail_percentage 15896 1727203865.35618: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.35618: done checking to see if all hosts have failed 15896 1727203865.35619: getting the remaining hosts for this loop 15896 1727203865.35621: done getting the remaining hosts for this loop 15896 1727203865.35626: getting the next task for host managed-node1 15896 1727203865.35634: done getting next task for host managed-node1 15896 1727203865.35637: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203865.35641: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203865.35655: getting variables 15896 1727203865.35657: in VariableManager get_vars() 15896 1727203865.35711: Calling all_inventory to load vars for managed-node1 15896 1727203865.35714: Calling groups_inventory to load vars for managed-node1 15896 1727203865.35716: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.35726: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.35728: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.35731: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.36103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.36532: done with get_vars() 15896 1727203865.36545: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.068) 0:00:10.955 ***** 15896 1727203865.36654: entering _queue_task() for managed-node1/stat 15896 1727203865.36950: worker is 1 (out of 1 
available) 15896 1727203865.36965: exiting _queue_task() for managed-node1/stat 15896 1727203865.37078: done queuing things up, now waiting for results queue to drain 15896 1727203865.37081: waiting for pending results... 15896 1727203865.37392: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203865.37397: in run() - task 028d2410-947f-fb83-b6ad-000000000280 15896 1727203865.37411: variable 'ansible_search_path' from source: unknown 15896 1727203865.37417: variable 'ansible_search_path' from source: unknown 15896 1727203865.37456: calling self._execute() 15896 1727203865.37539: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.37550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.37565: variable 'omit' from source: magic vars 15896 1727203865.37943: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.37967: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.38132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203865.38487: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203865.38539: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203865.38607: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203865.38622: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203865.38714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203865.38747: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203865.38831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203865.38834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203865.38907: variable '__network_is_ostree' from source: set_fact 15896 1727203865.38919: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203865.38927: when evaluation is False, skipping this task 15896 1727203865.38939: _execute() done 15896 1727203865.38973: dumping result to json 15896 1727203865.38984: done dumping result, returning 15896 1727203865.38996: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-fb83-b6ad-000000000280] 15896 1727203865.39048: sending task result for task 028d2410-947f-fb83-b6ad-000000000280 15896 1727203865.39121: done sending task result for task 028d2410-947f-fb83-b6ad-000000000280 15896 1727203865.39124: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203865.39206: no more pending results, returning what we have 15896 1727203865.39211: results queue empty 15896 1727203865.39212: checking for any_errors_fatal 15896 1727203865.39218: done checking for any_errors_fatal 15896 1727203865.39219: checking for max_fail_percentage 15896 1727203865.39221: done checking for max_fail_percentage 15896 1727203865.39221: checking to see if all hosts have failed and the running result is not ok 15896 
1727203865.39222: done checking to see if all hosts have failed 15896 1727203865.39223: getting the remaining hosts for this loop 15896 1727203865.39225: done getting the remaining hosts for this loop 15896 1727203865.39228: getting the next task for host managed-node1 15896 1727203865.39235: done getting next task for host managed-node1 15896 1727203865.39239: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203865.39243: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203865.39257: getting variables 15896 1727203865.39261: in VariableManager get_vars() 15896 1727203865.39320: Calling all_inventory to load vars for managed-node1 15896 1727203865.39323: Calling groups_inventory to load vars for managed-node1 15896 1727203865.39325: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.39336: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.39338: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.39341: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.39837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.40056: done with get_vars() 15896 1727203865.40068: done getting variables 15896 1727203865.40123: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.035) 0:00:10.990 ***** 15896 1727203865.40150: entering _queue_task() for managed-node1/set_fact 15896 1727203865.40353: worker is 1 (out of 1 available) 15896 1727203865.40368: exiting _queue_task() for managed-node1/set_fact 15896 1727203865.40381: done queuing things up, now waiting for results queue to drain 15896 1727203865.40383: waiting for pending results... 
15896 1727203865.40539: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203865.40632: in run() - task 028d2410-947f-fb83-b6ad-000000000281 15896 1727203865.40643: variable 'ansible_search_path' from source: unknown 15896 1727203865.40646: variable 'ansible_search_path' from source: unknown 15896 1727203865.40678: calling self._execute() 15896 1727203865.40738: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.40742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.40750: variable 'omit' from source: magic vars 15896 1727203865.41015: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.41024: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.41148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203865.41337: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203865.41368: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203865.41397: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203865.41422: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203865.41501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203865.41518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203865.41535: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203865.41553: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203865.41621: variable '__network_is_ostree' from source: set_fact 15896 1727203865.41627: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203865.41630: when evaluation is False, skipping this task 15896 1727203865.41633: _execute() done 15896 1727203865.41636: dumping result to json 15896 1727203865.41638: done dumping result, returning 15896 1727203865.41646: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-000000000281] 15896 1727203865.41650: sending task result for task 028d2410-947f-fb83-b6ad-000000000281 15896 1727203865.41731: done sending task result for task 028d2410-947f-fb83-b6ad-000000000281 15896 1727203865.41734: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203865.41785: no more pending results, returning what we have 15896 1727203865.41789: results queue empty 15896 1727203865.41789: checking for any_errors_fatal 15896 1727203865.41795: done checking for any_errors_fatal 15896 1727203865.41796: checking for max_fail_percentage 15896 1727203865.41797: done checking for max_fail_percentage 15896 1727203865.41798: checking to see if all hosts have failed and the running result is not ok 15896 1727203865.41799: done checking to see if all hosts have failed 15896 1727203865.41800: getting the remaining hosts for this loop 15896 1727203865.41801: done getting the remaining hosts for this loop 
15896 1727203865.41804: getting the next task for host managed-node1 15896 1727203865.41814: done getting next task for host managed-node1 15896 1727203865.41817: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203865.41821: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203865.41833: getting variables 15896 1727203865.41835: in VariableManager get_vars() 15896 1727203865.41887: Calling all_inventory to load vars for managed-node1 15896 1727203865.41890: Calling groups_inventory to load vars for managed-node1 15896 1727203865.41892: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203865.41904: Calling all_plugins_play to load vars for managed-node1 15896 1727203865.41907: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203865.41910: Calling groups_plugins_play to load vars for managed-node1 15896 1727203865.42081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203865.42297: done with get_vars() 15896 1727203865.42312: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:05 -0400 (0:00:00.022) 0:00:11.013 ***** 15896 1727203865.42406: entering _queue_task() for managed-node1/service_facts 15896 1727203865.42407: Creating lock for service_facts 15896 1727203865.42845: worker is 1 (out of 1 available) 15896 1727203865.42857: exiting _queue_task() for managed-node1/service_facts 15896 1727203865.42872: done queuing things up, now waiting for results queue to drain 15896 1727203865.42874: waiting for pending results... 
15896 1727203865.43345: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203865.43350: in run() - task 028d2410-947f-fb83-b6ad-000000000283 15896 1727203865.43353: variable 'ansible_search_path' from source: unknown 15896 1727203865.43357: variable 'ansible_search_path' from source: unknown 15896 1727203865.43363: calling self._execute() 15896 1727203865.43429: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.43439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.43444: variable 'omit' from source: magic vars 15896 1727203865.43769: variable 'ansible_distribution_major_version' from source: facts 15896 1727203865.43778: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203865.43784: variable 'omit' from source: magic vars 15896 1727203865.43848: variable 'omit' from source: magic vars 15896 1727203865.43879: variable 'omit' from source: magic vars 15896 1727203865.43915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203865.43950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203865.43982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203865.44052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.44090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203865.44094: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203865.44096: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.44098: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203865.44185: Set connection var ansible_shell_type to sh 15896 1727203865.44191: Set connection var ansible_connection to ssh 15896 1727203865.44310: Set connection var ansible_shell_executable to /bin/sh 15896 1727203865.44314: Set connection var ansible_pipelining to False 15896 1727203865.44317: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203865.44320: Set connection var ansible_timeout to 10 15896 1727203865.44324: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.44328: variable 'ansible_connection' from source: unknown 15896 1727203865.44331: variable 'ansible_module_compression' from source: unknown 15896 1727203865.44334: variable 'ansible_shell_type' from source: unknown 15896 1727203865.44336: variable 'ansible_shell_executable' from source: unknown 15896 1727203865.44339: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203865.44341: variable 'ansible_pipelining' from source: unknown 15896 1727203865.44344: variable 'ansible_timeout' from source: unknown 15896 1727203865.44346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203865.44438: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203865.44446: variable 'omit' from source: magic vars 15896 1727203865.44451: starting attempt loop 15896 1727203865.44453: running the handler 15896 1727203865.44471: _low_level_execute_command(): starting 15896 1727203865.44480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203865.45271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203865.45282: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203865.45299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203865.45304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.45319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203865.45324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203865.45336: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203865.45341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.45417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203865.45436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.45539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.47341: stdout chunk (state=3): >>>/root <<< 15896 1727203865.47446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203865.47469: stderr chunk (state=3): >>><<< 15896 1727203865.47472: stdout chunk (state=3): >>><<< 15896 1727203865.47493: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203865.47504: _low_level_execute_command(): starting 15896 1727203865.47510: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273 `" && echo ansible-tmp-1727203865.4749227-17141-104466944678273="` echo /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273 `" ) && sleep 0' 15896 1727203865.48133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203865.48163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.48280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.50394: stdout chunk (state=3): >>>ansible-tmp-1727203865.4749227-17141-104466944678273=/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273 <<< 15896 1727203865.50506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203865.50530: stderr chunk (state=3): >>><<< 15896 1727203865.50533: stdout chunk (state=3): >>><<< 15896 1727203865.50548: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203865.4749227-17141-104466944678273=/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203865.50594: variable 'ansible_module_compression' from source: unknown 15896 1727203865.50628: ANSIBALLZ: Using lock for service_facts 15896 1727203865.50631: ANSIBALLZ: Acquiring lock 15896 1727203865.50633: ANSIBALLZ: Lock acquired: 140082267861360 15896 1727203865.50636: ANSIBALLZ: Creating module 15896 1727203865.62182: ANSIBALLZ: Writing module into payload 15896 1727203865.62433: ANSIBALLZ: Writing module 15896 1727203865.62449: ANSIBALLZ: Renaming module 15896 1727203865.62505: ANSIBALLZ: Done creating module 15896 1727203865.62678: variable 'ansible_facts' from source: unknown 15896 1727203865.62682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py 15896 1727203865.63115: Sending initial data 15896 1727203865.63118: Sent initial data (162 bytes) 15896 1727203865.64096: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.64199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203865.64435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.64611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.66389: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203865.66494: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203865.66600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpinxbimla /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py <<< 15896 1727203865.66603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py" <<< 15896 1727203865.66699: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpinxbimla" to remote "/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py" <<< 15896 1727203865.68057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203865.68064: stdout chunk (state=3): >>><<< 15896 1727203865.68067: stderr chunk (state=3): >>><<< 15896 1727203865.68070: done transferring module to remote 15896 1727203865.68072: _low_level_execute_command(): starting 15896 1727203865.68074: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/ /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py && sleep 0' 15896 1727203865.69100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203865.69310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.69395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203865.71580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203865.71584: stdout chunk (state=3): >>><<< 15896 1727203865.71586: stderr chunk (state=3): >>><<< 15896 1727203865.71589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203865.71597: _low_level_execute_command(): starting 15896 1727203865.71599: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/AnsiballZ_service_facts.py && sleep 0' 15896 1727203865.72164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203865.72183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203865.72199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203865.72223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203865.72246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203865.72262: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203865.72340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203865.72381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203865.72405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
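The `_low_level_execute_command()` sequence above follows a fixed shape: create a private remote tmp directory with `( umask 77 && mkdir -p ... && mkdir "ansible-tmp-<epoch>-<pid>-<random>" && echo ... ) && sleep 0`, transfer the AnsiballZ payload into it, `chmod u+x` it, then run it with the remote Python. A local sketch of just the tmp-dir step (the helper name `make_remote_style_tmpdir` is hypothetical; only the naming and `umask 77` pattern are taken from the log) might look like:

```python
import os
import random
import subprocess
import time

def make_remote_style_tmpdir(base):
    # Mimic Ansible's remote tmp naming: ansible-tmp-<epoch>-<pid>-<random>
    name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
    # umask 77 inside the subshell -> the new directory is created 0700,
    # readable only by the owner, matching the command seen in the log
    cmd = (f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{name}" '
           f'&& echo "{base}/{name}" ) && sleep 0')
    out = subprocess.run(["/bin/sh", "-c", cmd],
                         capture_output=True, text=True, check=True)
    return out.stdout.strip()
```

Running the same one-liner over SSH (as the log does) returns the path on stdout, which Ansible parses to know where to upload `AnsiballZ_service_facts.py`.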
15896 1727203865.72419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203865.72543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203867.49742: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15896 1727203867.49770: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 15896 1727203867.49785: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name":
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state":
"unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"},
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15896 1727203867.51623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203867.51650: stderr chunk (state=3): >>><<< 15896 1727203867.51653: stdout chunk (state=3): >>><<< 15896 1727203867.51680: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": 
"wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", 
"state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": 
"systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203867.52055: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203867.52068: _low_level_execute_command(): starting 15896 1727203867.52071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203865.4749227-17141-104466944678273/ > /dev/null 2>&1 && sleep 0' 15896 1727203867.52518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203867.52522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203867.52524: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203867.52526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203867.52528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203867.52573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203867.52581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203867.52656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203867.54638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203867.54662: stderr chunk (state=3): >>><<< 15896 1727203867.54668: stdout chunk (state=3): >>><<< 15896 1727203867.54684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203867.54690: handler run complete 15896 1727203867.54803: variable 'ansible_facts' from source: unknown 15896 1727203867.55703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203867.55965: variable 'ansible_facts' from source: unknown 15896 1727203867.56068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203867.56288: attempt loop complete, returning result 15896 1727203867.56291: _execute() done 15896 1727203867.56293: dumping result to json 15896 1727203867.56309: done dumping result, returning 15896 1727203867.56319: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-fb83-b6ad-000000000283] 15896 1727203867.56322: sending task result for task 028d2410-947f-fb83-b6ad-000000000283 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203867.57247: no more pending results, returning what we have 15896 1727203867.57250: results queue empty 15896 1727203867.57251: checking for any_errors_fatal 15896 1727203867.57254: done checking for any_errors_fatal 15896 1727203867.57254: checking for max_fail_percentage 15896 1727203867.57256: done checking for max_fail_percentage 15896 1727203867.57257: checking to see if all hosts have failed and the running result is not ok 15896 1727203867.57257: done checking to see if all hosts have failed 15896 1727203867.57261: getting the remaining hosts for this loop 15896 1727203867.57262: done getting the remaining hosts for this loop 15896 1727203867.57265: getting the next 
task for host managed-node1 15896 1727203867.57269: done getting next task for host managed-node1 15896 1727203867.57272: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203867.57278: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203867.57287: getting variables 15896 1727203867.57288: in VariableManager get_vars() 15896 1727203867.57325: Calling all_inventory to load vars for managed-node1 15896 1727203867.57328: Calling groups_inventory to load vars for managed-node1 15896 1727203867.57330: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203867.57338: Calling all_plugins_play to load vars for managed-node1 15896 1727203867.57341: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203867.57344: Calling groups_plugins_play to load vars for managed-node1 15896 1727203867.57709: done sending task result for task 028d2410-947f-fb83-b6ad-000000000283 15896 1727203867.57712: WORKER PROCESS EXITING 15896 1727203867.57734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203867.58219: done with get_vars() 15896 1727203867.58234: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:51:07 -0400 (0:00:02.159) 0:00:13.172 ***** 15896 1727203867.58337: entering _queue_task() for managed-node1/package_facts 15896 1727203867.58339: Creating lock for package_facts 15896 1727203867.58641: worker is 1 (out of 1 available) 15896 1727203867.58654: exiting _queue_task() for managed-node1/package_facts 15896 1727203867.58668: done queuing things up, now waiting for results queue to drain 15896 1727203867.58670: waiting for pending results... 
15896 1727203867.58931: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203867.59079: in run() - task 028d2410-947f-fb83-b6ad-000000000284 15896 1727203867.59103: variable 'ansible_search_path' from source: unknown 15896 1727203867.59111: variable 'ansible_search_path' from source: unknown 15896 1727203867.59147: calling self._execute() 15896 1727203867.59238: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203867.59250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203867.59264: variable 'omit' from source: magic vars 15896 1727203867.59622: variable 'ansible_distribution_major_version' from source: facts 15896 1727203867.59640: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203867.59654: variable 'omit' from source: magic vars 15896 1727203867.59728: variable 'omit' from source: magic vars 15896 1727203867.59769: variable 'omit' from source: magic vars 15896 1727203867.59815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203867.59853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203867.59882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203867.59904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203867.60081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203867.60084: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203867.60087: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203867.60089: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203867.60092: Set connection var ansible_shell_type to sh 15896 1727203867.60094: Set connection var ansible_connection to ssh 15896 1727203867.60095: Set connection var ansible_shell_executable to /bin/sh 15896 1727203867.60097: Set connection var ansible_pipelining to False 15896 1727203867.60099: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203867.60101: Set connection var ansible_timeout to 10 15896 1727203867.60125: variable 'ansible_shell_executable' from source: unknown 15896 1727203867.60134: variable 'ansible_connection' from source: unknown 15896 1727203867.60142: variable 'ansible_module_compression' from source: unknown 15896 1727203867.60150: variable 'ansible_shell_type' from source: unknown 15896 1727203867.60156: variable 'ansible_shell_executable' from source: unknown 15896 1727203867.60163: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203867.60170: variable 'ansible_pipelining' from source: unknown 15896 1727203867.60179: variable 'ansible_timeout' from source: unknown 15896 1727203867.60190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203867.60411: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203867.60431: variable 'omit' from source: magic vars 15896 1727203867.60442: starting attempt loop 15896 1727203867.60448: running the handler 15896 1727203867.60467: _low_level_execute_command(): starting 15896 1727203867.60483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203867.61399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203867.61443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203867.61471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203867.61494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203867.61743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203867.63454: stdout chunk (state=3): >>>/root <<< 15896 1727203867.63643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203867.63658: stdout chunk (state=3): >>><<< 15896 1727203867.63674: stderr chunk (state=3): >>><<< 15896 1727203867.64004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203867.64009: _low_level_execute_command(): starting 15896 1727203867.64012: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597 `" && echo ansible-tmp-1727203867.639044-17289-102965208409597="` echo /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597 `" ) && sleep 0' 15896 1727203867.65214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203867.65229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203867.65318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203867.65512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203867.65529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203867.65751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203867.65786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203867.67930: stdout chunk (state=3): >>>ansible-tmp-1727203867.639044-17289-102965208409597=/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597 <<< 15896 1727203867.68074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203867.68099: stdout chunk (state=3): >>><<< 15896 1727203867.68113: stderr chunk (state=3): >>><<< 15896 1727203867.68135: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203867.639044-17289-102965208409597=/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203867.68197: variable 'ansible_module_compression' from source: unknown 15896 1727203867.68250: ANSIBALLZ: Using lock for package_facts 15896 1727203867.68259: ANSIBALLZ: Acquiring lock 15896 1727203867.68266: ANSIBALLZ: Lock acquired: 140082268928864 15896 1727203867.68274: ANSIBALLZ: Creating module 15896 1727203868.12734: ANSIBALLZ: Writing module into payload 15896 1727203868.12935: ANSIBALLZ: Writing module 15896 1727203868.12990: ANSIBALLZ: Renaming module 15896 1727203868.12994: ANSIBALLZ: Done creating module 15896 1727203868.13034: variable 'ansible_facts' from source: unknown 15896 1727203868.13270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py 15896 1727203868.13539: Sending initial data 15896 1727203868.13544: Sent initial data (161 bytes) 15896 1727203868.14009: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203868.14012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.14015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203868.14017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.14071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203868.14074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203868.14160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203868.15911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203868.15994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203868.16099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpny_gt8p1 /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py <<< 15896 1727203868.16102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py" <<< 15896 1727203868.16185: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpny_gt8p1" to remote "/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py" <<< 15896 1727203868.17509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203868.17556: stderr chunk (state=3): >>><<< 15896 1727203868.17559: stdout chunk (state=3): >>><<< 15896 1727203868.17580: done transferring module to remote 15896 1727203868.17591: _low_level_execute_command(): starting 15896 1727203868.17595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/ /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py && sleep 0' 15896 1727203868.18124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.18133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203868.18232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203868.20318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203868.20325: stdout chunk (state=3): >>><<< 15896 1727203868.20328: stderr chunk (state=3): >>><<< 15896 1727203868.20426: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203868.20430: _low_level_execute_command(): starting 15896 1727203868.20432: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/AnsiballZ_package_facts.py && sleep 0' 15896 1727203868.21015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203868.21018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203868.21078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203868.21098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.21152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203868.21182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203868.21209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203868.21329: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15896 1727203868.68644: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": 
"langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": 
[{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": 
"0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source":
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch":
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}],
"libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": 
"2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": 
"4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", 
"release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15896 1727203868.70790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203868.70795: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203868.70847: stdout chunk (state=3): >>><<< 15896 1727203868.70851: stderr chunk (state=3): >>><<< 15896 1727203868.70864: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 15896 1727203868.72533: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203868.72537: _low_level_execute_command(): starting 15896 1727203868.72539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203867.639044-17289-102965208409597/ > /dev/null 2>&1 && sleep 0' 15896 1727203868.72981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203868.72988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203868.73014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.73017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203868.73019: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203868.73021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203868.73071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203868.73079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203868.73234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203868.75180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203868.75212: stderr chunk (state=3): >>><<< 15896 1727203868.75215: stdout chunk (state=3): >>><<< 15896 1727203868.75245: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203868.75250: handler run complete 15896 1727203868.76431: variable 'ansible_facts' from source: unknown 15896 1727203868.76688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203868.78741: variable 'ansible_facts' from source: unknown 15896 1727203868.79008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203868.79402: attempt loop complete, returning result 15896 1727203868.79412: _execute() done 15896 1727203868.79414: dumping result to json 15896 1727203868.79532: done dumping result, returning 15896 1727203868.79540: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-fb83-b6ad-000000000284] 15896 1727203868.79543: sending task result for task 028d2410-947f-fb83-b6ad-000000000284 15896 1727203868.87637: done sending task result for task 028d2410-947f-fb83-b6ad-000000000284 15896 1727203868.87641: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203868.87739: no more pending results, returning what we have 15896 1727203868.87742: results queue empty 15896 1727203868.87743: checking for any_errors_fatal 15896 1727203868.87746: done checking for any_errors_fatal 15896 1727203868.87747: checking for max_fail_percentage 15896 1727203868.87748: done checking for max_fail_percentage 15896 1727203868.87749: checking to see if all hosts have failed and the running result is not ok 15896 1727203868.87750: done checking to see if all hosts have failed 15896 1727203868.87750: getting the remaining hosts for this loop 15896 1727203868.87752: done getting the remaining hosts for this loop 15896 
1727203868.87755: getting the next task for host managed-node1 15896 1727203868.87761: done getting next task for host managed-node1 15896 1727203868.87764: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203868.87767: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203868.87780: getting variables 15896 1727203868.87782: in VariableManager get_vars() 15896 1727203868.87887: Calling all_inventory to load vars for managed-node1 15896 1727203868.87890: Calling groups_inventory to load vars for managed-node1 15896 1727203868.87896: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203868.87906: Calling all_plugins_play to load vars for managed-node1 15896 1727203868.87909: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203868.87912: Calling groups_plugins_play to load vars for managed-node1 15896 1727203868.89523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203868.91009: done with get_vars() 15896 1727203868.91035: done getting variables 15896 1727203868.91104: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:08 -0400 (0:00:01.328) 0:00:14.500 ***** 15896 1727203868.91142: entering _queue_task() for managed-node1/debug 15896 1727203868.91468: worker is 1 (out of 1 available) 15896 1727203868.91684: exiting _queue_task() for managed-node1/debug 15896 1727203868.91697: done queuing things up, now waiting for results queue to drain 15896 1727203868.91699: waiting for pending results... 15896 1727203868.91867: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203868.92204: in run() - task 028d2410-947f-fb83-b6ad-000000000027 15896 1727203868.92208: variable 'ansible_search_path' from source: unknown 15896 1727203868.92211: variable 'ansible_search_path' from source: unknown 15896 1727203868.92214: calling self._execute() 15896 1727203868.92349: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203868.92580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203868.92583: variable 'omit' from source: magic vars 15896 1727203868.93163: variable 'ansible_distribution_major_version' from source: facts 15896 1727203868.93227: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203868.93431: variable 'omit' from source: magic vars 15896 1727203868.93434: variable 'omit' from source: magic vars 15896 1727203868.93662: variable 'network_provider' from source: set_fact 15896 1727203868.93687: variable 'omit' from source: magic vars 15896 1727203868.93731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203868.93800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 
1727203868.93829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203868.93877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203868.93895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203868.93930: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203868.93938: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203868.93946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203868.94050: Set connection var ansible_shell_type to sh 15896 1727203868.94066: Set connection var ansible_connection to ssh 15896 1727203868.94077: Set connection var ansible_shell_executable to /bin/sh 15896 1727203868.94090: Set connection var ansible_pipelining to False 15896 1727203868.94099: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203868.94110: Set connection var ansible_timeout to 10 15896 1727203868.94136: variable 'ansible_shell_executable' from source: unknown 15896 1727203868.94144: variable 'ansible_connection' from source: unknown 15896 1727203868.94150: variable 'ansible_module_compression' from source: unknown 15896 1727203868.94156: variable 'ansible_shell_type' from source: unknown 15896 1727203868.94166: variable 'ansible_shell_executable' from source: unknown 15896 1727203868.94172: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203868.94181: variable 'ansible_pipelining' from source: unknown 15896 1727203868.94194: variable 'ansible_timeout' from source: unknown 15896 1727203868.94203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203868.94411: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203868.94415: variable 'omit' from source: magic vars 15896 1727203868.94417: starting attempt loop 15896 1727203868.94419: running the handler 15896 1727203868.94426: handler run complete 15896 1727203868.94443: attempt loop complete, returning result 15896 1727203868.94450: _execute() done 15896 1727203868.94461: dumping result to json 15896 1727203868.94470: done dumping result, returning 15896 1727203868.94483: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-fb83-b6ad-000000000027] 15896 1727203868.94491: sending task result for task 028d2410-947f-fb83-b6ad-000000000027 ok: [managed-node1] => {} MSG: Using network provider: nm 15896 1727203868.94686: no more pending results, returning what we have 15896 1727203868.94690: results queue empty 15896 1727203868.94690: checking for any_errors_fatal 15896 1727203868.94702: done checking for any_errors_fatal 15896 1727203868.94703: checking for max_fail_percentage 15896 1727203868.94705: done checking for max_fail_percentage 15896 1727203868.94705: checking to see if all hosts have failed and the running result is not ok 15896 1727203868.94706: done checking to see if all hosts have failed 15896 1727203868.94707: getting the remaining hosts for this loop 15896 1727203868.94708: done getting the remaining hosts for this loop 15896 1727203868.94712: getting the next task for host managed-node1 15896 1727203868.94719: done getting next task for host managed-node1 15896 1727203868.94723: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 
1727203868.94727: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203868.94737: getting variables 15896 1727203868.94739: in VariableManager get_vars() 15896 1727203868.94900: Calling all_inventory to load vars for managed-node1 15896 1727203868.94903: Calling groups_inventory to load vars for managed-node1 15896 1727203868.94906: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203868.95085: Calling all_plugins_play to load vars for managed-node1 15896 1727203868.95088: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203868.95091: Calling groups_plugins_play to load vars for managed-node1 15896 1727203868.95789: done sending task result for task 028d2410-947f-fb83-b6ad-000000000027 15896 1727203868.95793: WORKER PROCESS EXITING 15896 1727203868.96454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203868.99320: done with get_vars() 15896 1727203868.99353: done getting variables 15896 1727203868.99653: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:08 -0400 (0:00:00.085) 0:00:14.586 ***** 15896 1727203868.99690: entering _queue_task() for managed-node1/fail 15896 1727203868.99692: Creating lock for fail 15896 1727203869.00434: worker is 1 (out of 1 available) 15896 1727203869.00448: exiting _queue_task() for managed-node1/fail 15896 1727203869.00463: done queuing things up, now waiting for results queue to drain 15896 1727203869.00465: waiting for pending results... 15896 1727203869.00951: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203869.01297: in run() - task 028d2410-947f-fb83-b6ad-000000000028 15896 1727203869.01318: variable 'ansible_search_path' from source: unknown 15896 1727203869.01325: variable 'ansible_search_path' from source: unknown 15896 1727203869.01368: calling self._execute() 15896 1727203869.01681: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203869.01684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203869.01687: variable 'omit' from source: magic vars 15896 1727203869.02341: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.02357: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203869.02567: variable 'network_state' from source: role '' defaults 15896 1727203869.02635: Evaluated conditional (network_state != {}): False 15896 1727203869.02642: when evaluation is False, skipping this task 15896 1727203869.02649: _execute() done 15896 1727203869.02780: dumping result to json 15896 1727203869.02783: done dumping result, returning 15896 1727203869.02787: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-fb83-b6ad-000000000028] 15896 1727203869.02790: sending task result for task 028d2410-947f-fb83-b6ad-000000000028 15896 1727203869.03181: done sending task result for task 028d2410-947f-fb83-b6ad-000000000028 15896 1727203869.03185: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203869.03231: no more pending results, returning what we have 15896 1727203869.03235: results queue empty 15896 1727203869.03236: checking for any_errors_fatal 15896 1727203869.03242: done checking for any_errors_fatal 15896 1727203869.03243: checking for max_fail_percentage 15896 1727203869.03244: done checking for max_fail_percentage 15896 1727203869.03245: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.03246: done checking to see if all hosts have failed 15896 1727203869.03246: getting the remaining hosts for this loop 15896 1727203869.03248: done getting the remaining hosts for this loop 15896 1727203869.03252: getting the next task for host managed-node1 15896 1727203869.03257: done getting next task for host managed-node1 15896 1727203869.03264: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203869.03267: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203869.03284: getting variables 15896 1727203869.03286: in VariableManager get_vars() 15896 1727203869.03340: Calling all_inventory to load vars for managed-node1 15896 1727203869.03342: Calling groups_inventory to load vars for managed-node1 15896 1727203869.03345: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.03356: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.03362: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.03365: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.06181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203869.09404: done with get_vars() 15896 1727203869.09437: done getting variables 15896 1727203869.09609: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.099) 0:00:14.685 ***** 15896 1727203869.09642: entering _queue_task() for managed-node1/fail 15896 1727203869.10494: worker is 1 (out of 1 available) 15896 1727203869.10505: exiting _queue_task() for managed-node1/fail 15896 1727203869.10518: done queuing things up, now waiting for results queue to drain 15896 1727203869.10519: waiting for pending results... 
15896 1727203869.10768: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203869.11128: in run() - task 028d2410-947f-fb83-b6ad-000000000029 15896 1727203869.11132: variable 'ansible_search_path' from source: unknown 15896 1727203869.11134: variable 'ansible_search_path' from source: unknown 15896 1727203869.11217: calling self._execute() 15896 1727203869.11443: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203869.11562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203869.11566: variable 'omit' from source: magic vars 15896 1727203869.12325: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.12328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203869.12559: variable 'network_state' from source: role '' defaults 15896 1727203869.12625: Evaluated conditional (network_state != {}): False 15896 1727203869.12635: when evaluation is False, skipping this task 15896 1727203869.12642: _execute() done 15896 1727203869.12657: dumping result to json 15896 1727203869.12671: done dumping result, returning 15896 1727203869.12686: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-fb83-b6ad-000000000029] 15896 1727203869.12696: sending task result for task 028d2410-947f-fb83-b6ad-000000000029 15896 1727203869.12885: done sending task result for task 028d2410-947f-fb83-b6ad-000000000029 15896 1727203869.12889: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203869.12938: no more pending results, returning what we have 15896 
1727203869.12941: results queue empty 15896 1727203869.12942: checking for any_errors_fatal 15896 1727203869.12952: done checking for any_errors_fatal 15896 1727203869.12953: checking for max_fail_percentage 15896 1727203869.12955: done checking for max_fail_percentage 15896 1727203869.12955: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.12956: done checking to see if all hosts have failed 15896 1727203869.12957: getting the remaining hosts for this loop 15896 1727203869.12961: done getting the remaining hosts for this loop 15896 1727203869.12964: getting the next task for host managed-node1 15896 1727203869.12969: done getting next task for host managed-node1 15896 1727203869.12974: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203869.12979: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203869.12994: getting variables 15896 1727203869.12995: in VariableManager get_vars() 15896 1727203869.13047: Calling all_inventory to load vars for managed-node1 15896 1727203869.13049: Calling groups_inventory to load vars for managed-node1 15896 1727203869.13052: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.13065: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.13067: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.13070: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.15685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203869.17251: done with get_vars() 15896 1727203869.17284: done getting variables 15896 1727203869.17346: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.077) 0:00:14.763 ***** 15896 1727203869.17386: entering _queue_task() for managed-node1/fail 15896 1727203869.17730: worker is 1 (out of 1 available) 15896 1727203869.17741: exiting _queue_task() for managed-node1/fail 15896 1727203869.17753: done queuing things up, now waiting for results queue to drain 15896 1727203869.17755: waiting for pending results... 
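The skip recorded above ("Evaluated conditional (network_state != {}): False") follows from `network_state` coming from the role defaults, where it is an empty dict. As a minimal sketch of how that abort gate behaves (the function and the combined version check are an illustrative reconstruction inferred from the task name, not the role's actual task source):

```python
def abort_needed(network_state: dict, distribution_major_version: str) -> bool:
    """Illustrative reconstruction of the 'abort ... below 8' gate:
    only a non-empty network_state on a host older than EL8 should fail."""
    return network_state != {} and int(distribution_major_version) < 8

# The role default is network_state: {}, so the first operand is False,
# the conditional short-circuits, and the task is skipped as in the log.
print(abort_needed({}, "9"))                   # False
print(abort_needed({"interfaces": []}, "7"))   # True
```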
15896 1727203869.18192: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203869.18197: in run() - task 028d2410-947f-fb83-b6ad-00000000002a 15896 1727203869.18200: variable 'ansible_search_path' from source: unknown 15896 1727203869.18203: variable 'ansible_search_path' from source: unknown 15896 1727203869.18240: calling self._execute() 15896 1727203869.18338: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203869.18349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203869.18365: variable 'omit' from source: magic vars 15896 1727203869.18737: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.18757: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203869.18935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203869.21357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203869.21583: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203869.21587: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203869.21589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203869.21592: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203869.21637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.21674: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.21710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.21754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.21785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.21896: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.21919: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15896 1727203869.22057: variable 'ansible_distribution' from source: facts 15896 1727203869.22072: variable '__network_rh_distros' from source: role '' defaults 15896 1727203869.22089: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15896 1727203869.22353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.22394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.22428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 
1727203869.22480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.22502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.22553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.22588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.22616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.22661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.22780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.22783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.22786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15896 1727203869.22788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.22829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.22848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.23179: variable 'network_connections' from source: task vars 15896 1727203869.23196: variable 'controller_profile' from source: play vars 15896 1727203869.23268: variable 'controller_profile' from source: play vars 15896 1727203869.23285: variable 'controller_device' from source: play vars 15896 1727203869.23351: variable 'controller_device' from source: play vars 15896 1727203869.23369: variable 'port1_profile' from source: play vars 15896 1727203869.23431: variable 'port1_profile' from source: play vars 15896 1727203869.23448: variable 'dhcp_interface1' from source: play vars 15896 1727203869.23515: variable 'dhcp_interface1' from source: play vars 15896 1727203869.23527: variable 'controller_profile' from source: play vars 15896 1727203869.23596: variable 'controller_profile' from source: play vars 15896 1727203869.23608: variable 'port2_profile' from source: play vars 15896 1727203869.23674: variable 'port2_profile' from source: play vars 15896 1727203869.23777: variable 'dhcp_interface2' from source: play vars 15896 1727203869.23780: variable 'dhcp_interface2' from source: play vars 15896 1727203869.23783: variable 'controller_profile' from source: play vars 15896 1727203869.23834: variable 'controller_profile' from source: play vars 15896 1727203869.23847: 
variable 'network_state' from source: role '' defaults 15896 1727203869.23922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203869.24107: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203869.24149: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203869.24188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203869.24228: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203869.24297: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203869.24328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203869.24358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.24393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203869.24467: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15896 1727203869.24515: when evaluation is False, skipping this task 15896 1727203869.24518: _execute() done 15896 1727203869.24520: dumping result to 
json 15896 1727203869.24522: done dumping result, returning 15896 1727203869.24527: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-fb83-b6ad-00000000002a] 15896 1727203869.24537: sending task result for task 028d2410-947f-fb83-b6ad-00000000002a 15896 1727203869.24911: done sending task result for task 028d2410-947f-fb83-b6ad-00000000002a 15896 1727203869.24915: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15896 1727203869.24952: no more pending results, returning what we have 15896 1727203869.24955: results queue empty 15896 1727203869.24956: checking for any_errors_fatal 15896 1727203869.24962: done checking for any_errors_fatal 15896 1727203869.24963: checking for max_fail_percentage 15896 1727203869.24965: done checking for max_fail_percentage 15896 1727203869.24965: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.24966: done checking to see if all hosts have failed 15896 1727203869.24967: getting the remaining hosts for this loop 15896 1727203869.24968: done getting the remaining hosts for this loop 15896 1727203869.24971: getting the next task for host managed-node1 15896 1727203869.24977: done getting next task for host managed-node1 15896 1727203869.24981: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203869.24983: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203869.24996: getting variables 15896 1727203869.24998: in VariableManager get_vars() 15896 1727203869.25042: Calling all_inventory to load vars for managed-node1 15896 1727203869.25045: Calling groups_inventory to load vars for managed-node1 15896 1727203869.25047: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.25056: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.25061: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.25065: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.27687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203869.29723: done with get_vars() 15896 1727203869.29749: done getting variables 15896 1727203869.29856: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.125) 0:00:14.888 ***** 
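The "Conditional result was False" for the EL10 teaming abort above comes from the Jinja2 pipeline `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`. A rough Python equivalent of that filter chain follows; the sample connection list is hypothetical (the play's real `controller_profile`/`port*_profile` values are not shown in the log), chosen only so that no entry has type `team`:

```python
import re

def team_entries(items):
    """Approximate selectattr('type', 'defined') | selectattr('type', 'match', '^team$'):
    keep dicts that define a 'type' key whose value matches ^team$."""
    return [i for i in items if "type" in i and re.match(r"^team$", i["type"])]

def uses_team(network_connections, network_state):
    # Mirrors the logged conditional: team entries in either the connection
    # list or in network_state's "interfaces" list trigger the abort.
    return (len(team_entries(network_connections)) > 0
            or len(team_entries(network_state.get("interfaces", []))) > 0)

# Hypothetical bond-style profile set: no 'team' type anywhere, so the
# abort task is skipped, matching the log above.
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]
print(uses_team(connections, {}))  # False
```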
15896 1727203869.29893: entering _queue_task() for managed-node1/dnf 15896 1727203869.30245: worker is 1 (out of 1 available) 15896 1727203869.30263: exiting _queue_task() for managed-node1/dnf 15896 1727203869.30279: done queuing things up, now waiting for results queue to drain 15896 1727203869.30281: waiting for pending results... 15896 1727203869.30579: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203869.30783: in run() - task 028d2410-947f-fb83-b6ad-00000000002b 15896 1727203869.30793: variable 'ansible_search_path' from source: unknown 15896 1727203869.30795: variable 'ansible_search_path' from source: unknown 15896 1727203869.30798: calling self._execute() 15896 1727203869.30895: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203869.30910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203869.30921: variable 'omit' from source: magic vars 15896 1727203869.31535: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.31550: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203869.31752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203869.34382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203869.34386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203869.34389: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203869.34392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203869.34394: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203869.34473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.34513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.34543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.34593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.34618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.34748: variable 'ansible_distribution' from source: facts 15896 1727203869.34762: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.34784: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15896 1727203869.34893: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203869.35019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.35049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.35085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.35125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.35143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.35191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.35214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.35239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.35284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.35301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.35340: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203869.35382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203869.35401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.35490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203869.35493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203869.35623: variable 'network_connections' from source: task vars 15896 1727203869.35639: variable 'controller_profile' from source: play vars 15896 1727203869.35711: variable 'controller_profile' from source: play vars 15896 1727203869.35724: variable 'controller_device' from source: play vars 15896 1727203869.35789: variable 'controller_device' from source: play vars 15896 1727203869.35803: variable 'port1_profile' from source: play vars 15896 1727203869.35872: variable 'port1_profile' from source: play vars 15896 1727203869.35923: variable 'dhcp_interface1' from source: play vars 15896 1727203869.35957: variable 'dhcp_interface1' from source: play vars 15896 1727203869.35971: variable 'controller_profile' from source: play vars 15896 1727203869.36036: variable 'controller_profile' from source: play vars 15896 1727203869.36048: variable 'port2_profile' from source: play vars 15896 
1727203869.36110: variable 'port2_profile' from source: play vars 15896 1727203869.36141: variable 'dhcp_interface2' from source: play vars 15896 1727203869.36198: variable 'dhcp_interface2' from source: play vars 15896 1727203869.36212: variable 'controller_profile' from source: play vars 15896 1727203869.36360: variable 'controller_profile' from source: play vars 15896 1727203869.36372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203869.36567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203869.36616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203869.36649: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203869.36695: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203869.36751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203869.36818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203869.36852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203869.36891: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203869.36981: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203869.37241: variable 
'network_connections' from source: task vars 15896 1727203869.37251: variable 'controller_profile' from source: play vars 15896 1727203869.37321: variable 'controller_profile' from source: play vars 15896 1727203869.37333: variable 'controller_device' from source: play vars 15896 1727203869.37421: variable 'controller_device' from source: play vars 15896 1727203869.37424: variable 'port1_profile' from source: play vars 15896 1727203869.37474: variable 'port1_profile' from source: play vars 15896 1727203869.37492: variable 'dhcp_interface1' from source: play vars 15896 1727203869.37638: variable 'dhcp_interface1' from source: play vars 15896 1727203869.37641: variable 'controller_profile' from source: play vars 15896 1727203869.37644: variable 'controller_profile' from source: play vars 15896 1727203869.37653: variable 'port2_profile' from source: play vars 15896 1727203869.37725: variable 'port2_profile' from source: play vars 15896 1727203869.37737: variable 'dhcp_interface2' from source: play vars 15896 1727203869.37812: variable 'dhcp_interface2' from source: play vars 15896 1727203869.37830: variable 'controller_profile' from source: play vars 15896 1727203869.37907: variable 'controller_profile' from source: play vars 15896 1727203869.37947: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203869.37954: when evaluation is False, skipping this task 15896 1727203869.37967: _execute() done 15896 1727203869.37977: dumping result to json 15896 1727203869.37991: done dumping result, returning 15896 1727203869.38079: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-00000000002b] 15896 1727203869.38083: sending task result for task 028d2410-947f-fb83-b6ad-00000000002b 15896 1727203869.38153: done sending task result for 
task 028d2410-947f-fb83-b6ad-00000000002b 15896 1727203869.38157: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203869.38236: no more pending results, returning what we have 15896 1727203869.38240: results queue empty 15896 1727203869.38241: checking for any_errors_fatal 15896 1727203869.38249: done checking for any_errors_fatal 15896 1727203869.38249: checking for max_fail_percentage 15896 1727203869.38252: done checking for max_fail_percentage 15896 1727203869.38253: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.38253: done checking to see if all hosts have failed 15896 1727203869.38254: getting the remaining hosts for this loop 15896 1727203869.38256: done getting the remaining hosts for this loop 15896 1727203869.38262: getting the next task for host managed-node1 15896 1727203869.38268: done getting next task for host managed-node1 15896 1727203869.38273: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203869.38277: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203869.38292: getting variables 15896 1727203869.38293: in VariableManager get_vars() 15896 1727203869.38350: Calling all_inventory to load vars for managed-node1 15896 1727203869.38353: Calling groups_inventory to load vars for managed-node1 15896 1727203869.38360: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.38371: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.38374: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.38685: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.40176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203869.41967: done with get_vars() 15896 1727203869.41998: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15896 1727203869.42089: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.122) 0:00:15.010 ***** 15896 1727203869.42122: entering _queue_task() for managed-node1/yum 15896 1727203869.42123: Creating lock for yum 15896 1727203869.42515: worker is 1 (out of 1 available) 15896 1727203869.42534: exiting _queue_task() for managed-node1/yum 15896 1727203869.42547: done queuing things up, now waiting for results queue to drain 15896 1727203869.42549: waiting for pending results... 
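The DNF update-availability task above was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` held for the configured connections. A simplified sketch of that or-gate, deriving both flags from connection types (the type strings and derivation are assumptions for illustration, not the role's exact Jinja):

```python
def package_check_needed(network_connections) -> bool:
    """Probe for network package updates only when a wireless or team
    interface is part of the requested configuration (illustrative)."""
    types = {c.get("type") for c in network_connections}
    return bool(types & {"wireless", "team"})

# A bond controller with ethernet ports defines neither flag, so the
# package-manager check is skipped, as in the log above.
print(package_check_needed([{"type": "bond"}, {"type": "ethernet"}]))  # False
```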
15896 1727203869.42994: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15896 1727203869.42999: in run() - task 028d2410-947f-fb83-b6ad-00000000002c
15896 1727203869.43007: variable 'ansible_search_path' from source: unknown
15896 1727203869.43019: variable 'ansible_search_path' from source: unknown
15896 1727203869.43072: calling self._execute()
15896 1727203869.43174: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203869.43188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203869.43206: variable 'omit' from source: magic vars
15896 1727203869.43617: variable 'ansible_distribution_major_version' from source: facts
15896 1727203869.43643: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203869.43841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203869.46015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203869.46159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203869.46162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203869.46180: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203869.46209: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203869.46295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.46328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.46356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.46405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.46425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.46523: variable 'ansible_distribution_major_version' from source: facts
15896 1727203869.46542: Evaluated conditional (ansible_distribution_major_version | int < 8): False
15896 1727203869.46548: when evaluation is False, skipping this task
15896 1727203869.46590: _execute() done
15896 1727203869.46593: dumping result to json
15896 1727203869.46595: done dumping result, returning
15896 1727203869.46598: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-00000000002c]
15896 1727203869.46600: sending task result for task 028d2410-947f-fb83-b6ad-00000000002c
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
15896 1727203869.46925: no more pending results, returning what we have
15896 1727203869.46928: results queue empty
15896 1727203869.46929: checking for any_errors_fatal
15896 1727203869.46934: done checking for any_errors_fatal
15896 1727203869.46935: checking for max_fail_percentage
15896 1727203869.46937: done checking for max_fail_percentage
15896 1727203869.46937: checking to see if all hosts have failed and the running result is not ok
15896 1727203869.46938: done checking to see if all hosts have failed
15896 1727203869.46939: getting the remaining hosts for this loop
15896 1727203869.46940: done getting the remaining hosts for this loop
15896 1727203869.46944: getting the next task for host managed-node1
15896 1727203869.46950: done getting next task for host managed-node1
15896 1727203869.46953: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
15896 1727203869.46956: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203869.46970: getting variables
15896 1727203869.46972: in VariableManager get_vars()
15896 1727203869.47027: Calling all_inventory to load vars for managed-node1
15896 1727203869.47029: Calling groups_inventory to load vars for managed-node1
15896 1727203869.47032: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203869.47042: Calling all_plugins_play to load vars for managed-node1
15896 1727203869.47044: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203869.47047: Calling groups_plugins_play to load vars for managed-node1
15896 1727203869.47589: done sending task result for task 028d2410-947f-fb83-b6ad-00000000002c
15896 1727203869.47592: WORKER PROCESS EXITING
15896 1727203869.48667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203869.50344: done with get_vars()
15896 1727203869.50372: done getting variables
15896 1727203869.50441: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.083) 0:00:15.094 *****
15896 1727203869.50492: entering _queue_task() for managed-node1/fail
15896 1727203869.50924: worker is 1 (out of 1 available)
15896 1727203869.50939: exiting _queue_task() for managed-node1/fail
15896 1727203869.50952: done queuing things up, now waiting for results queue to drain
15896 1727203869.50954: waiting for pending results...
15896 1727203869.51255: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
15896 1727203869.51438: in run() - task 028d2410-947f-fb83-b6ad-00000000002d
15896 1727203869.51465: variable 'ansible_search_path' from source: unknown
15896 1727203869.51474: variable 'ansible_search_path' from source: unknown
15896 1727203869.51519: calling self._execute()
15896 1727203869.51619: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203869.51631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203869.51645: variable 'omit' from source: magic vars
15896 1727203869.52105: variable 'ansible_distribution_major_version' from source: facts
15896 1727203869.52125: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203869.52264: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203869.52490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203869.54662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203869.54761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203869.54849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203869.54852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203869.54883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203869.54994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.55034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.55178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.55182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.55184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.55205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.55232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.55258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.55308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.55326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.55378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.55416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.55445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.55492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.55515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.55734: variable 'network_connections' from source: task vars
15896 1727203869.55752: variable 'controller_profile' from source: play vars
15896 1727203869.55835: variable 'controller_profile' from source: play vars
15896 1727203869.55851: variable 'controller_device' from source: play vars
15896 1727203869.55918: variable 'controller_device' from source: play vars
15896 1727203869.56180: variable 'port1_profile' from source: play vars
15896 1727203869.56183: variable 'port1_profile' from source: play vars
15896 1727203869.56185: variable 'dhcp_interface1' from source: play vars
15896 1727203869.56187: variable 'dhcp_interface1' from source: play vars
15896 1727203869.56189: variable 'controller_profile' from source: play vars
15896 1727203869.56191: variable 'controller_profile' from source: play vars
15896 1727203869.56192: variable 'port2_profile' from source: play vars
15896 1727203869.56207: variable 'port2_profile' from source: play vars
15896 1727203869.56217: variable 'dhcp_interface2' from source: play vars
15896 1727203869.56281: variable 'dhcp_interface2' from source: play vars
15896 1727203869.56293: variable 'controller_profile' from source: play vars
15896 1727203869.56356: variable 'controller_profile' from source: play vars
15896 1727203869.56435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203869.56616: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203869.56664: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203869.56704: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203869.56740: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203869.56791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203869.56817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203869.56850: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.56885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203869.56967: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203869.57235: variable 'network_connections' from source: task vars
15896 1727203869.57245: variable 'controller_profile' from source: play vars
15896 1727203869.57319: variable 'controller_profile' from source: play vars
15896 1727203869.57331: variable 'controller_device' from source: play vars
15896 1727203869.57411: variable 'controller_device' from source: play vars
15896 1727203869.57425: variable 'port1_profile' from source: play vars
15896 1727203869.57503: variable 'port1_profile' from source: play vars
15896 1727203869.57516: variable 'dhcp_interface1' from source: play vars
15896 1727203869.57584: variable 'dhcp_interface1' from source: play vars
15896 1727203869.57596: variable 'controller_profile' from source: play vars
15896 1727203869.57661: variable 'controller_profile' from source: play vars
15896 1727203869.57673: variable 'port2_profile' from source: play vars
15896 1727203869.57742: variable 'port2_profile' from source: play vars
15896 1727203869.57754: variable 'dhcp_interface2' from source: play vars
15896 1727203869.57935: variable 'dhcp_interface2' from source: play vars
15896 1727203869.57941: variable 'controller_profile' from source: play vars
15896 1727203869.57943: variable 'controller_profile' from source: play vars
15896 1727203869.57968: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15896 1727203869.57979: when evaluation is False, skipping this task
15896 1727203869.57987: _execute() done
15896 1727203869.57995: dumping result to json
15896 1727203869.58003: done dumping result, returning
15896 1727203869.58020: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-00000000002d]
15896 1727203869.58029: sending task result for task 028d2410-947f-fb83-b6ad-00000000002d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15896 1727203869.58204: no more pending results, returning what we have
15896 1727203869.58208: results queue empty
15896 1727203869.58209: checking for any_errors_fatal
15896 1727203869.58215: done checking for any_errors_fatal
15896 1727203869.58216: checking for max_fail_percentage
15896 1727203869.58218: done checking for max_fail_percentage
15896 1727203869.58219: checking to see if all hosts have failed and the running result is not ok
15896 1727203869.58220: done checking to see if all hosts have failed
15896 1727203869.58220: getting the remaining hosts for this loop
15896 1727203869.58222: done getting the remaining hosts for this loop
15896 1727203869.58226: getting the next task for host managed-node1
15896 1727203869.58233: done getting next task for host managed-node1
15896 1727203869.58237: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
15896 1727203869.58240: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203869.58256: getting variables
15896 1727203869.58258: in VariableManager get_vars()
15896 1727203869.58318: Calling all_inventory to load vars for managed-node1
15896 1727203869.58320: Calling groups_inventory to load vars for managed-node1
15896 1727203869.58322: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203869.58332: Calling all_plugins_play to load vars for managed-node1
15896 1727203869.58335: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203869.58337: Calling groups_plugins_play to load vars for managed-node1
15896 1727203869.58991: done sending task result for task 028d2410-947f-fb83-b6ad-00000000002d
15896 1727203869.58995: WORKER PROCESS EXITING
15896 1727203869.60078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203869.61686: done with get_vars()
15896 1727203869.61716: done getting variables
15896 1727203869.61776: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.113) 0:00:15.207 *****
15896 1727203869.61816: entering _queue_task() for managed-node1/package
15896 1727203869.62185: worker is 1 (out of 1 available)
15896 1727203869.62197: exiting _queue_task() for managed-node1/package
15896 1727203869.62210: done queuing things up, now waiting for results queue to drain
15896 1727203869.62212: waiting for pending results...
15896 1727203869.62535: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
15896 1727203869.62993: in run() - task 028d2410-947f-fb83-b6ad-00000000002e
15896 1727203869.63086: variable 'ansible_search_path' from source: unknown
15896 1727203869.63120: variable 'ansible_search_path' from source: unknown
15896 1727203869.63178: calling self._execute()
15896 1727203869.63313: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203869.63326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203869.63334: variable 'omit' from source: magic vars
15896 1727203869.63620: variable 'ansible_distribution_major_version' from source: facts
15896 1727203869.63638: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203869.63868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203869.64142: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203869.64180: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203869.64221: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203869.64243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203869.64403: variable 'network_packages' from source: role '' defaults
15896 1727203869.64525: variable '__network_provider_setup' from source: role '' defaults
15896 1727203869.64533: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203869.64603: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203869.64627: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203869.64661: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203869.64804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203869.75905: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203869.75947: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203869.76126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203869.76384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203869.76388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203869.76391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.76404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.76430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.76471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.76691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.76733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.76758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.76784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.76879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.76883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.77270: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15896 1727203869.77591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.77613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.77637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.77917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.77931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.78018: variable 'ansible_python' from source: facts
15896 1727203869.78045: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15896 1727203869.78389: variable '__network_wpa_supplicant_required' from source: role '' defaults
15896 1727203869.78546: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15896 1727203869.78931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.78955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.78980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.79017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.79033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.79079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203869.79321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203869.79328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.79371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203869.79783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203869.79787: variable 'network_connections' from source: task vars
15896 1727203869.79789: variable 'controller_profile' from source: play vars
15896 1727203869.79867: variable 'controller_profile' from source: play vars
15896 1727203869.79871: variable 'controller_device' from source: play vars
15896 1727203869.80145: variable 'controller_device' from source: play vars
15896 1727203869.80161: variable 'port1_profile' from source: play vars
15896 1727203869.80255: variable 'port1_profile' from source: play vars
15896 1727203869.80261: variable 'dhcp_interface1' from source: play vars
15896 1727203869.80580: variable 'dhcp_interface1' from source: play vars
15896 1727203869.80584: variable 'controller_profile' from source: play vars
15896 1727203869.80683: variable 'controller_profile' from source: play vars
15896 1727203869.80898: variable 'port2_profile' from source: play vars
15896 1727203869.80978: variable 'port2_profile' from source: play vars
15896 1727203869.81069: variable 'dhcp_interface2' from source: play vars
15896 1727203869.81264: variable 'dhcp_interface2' from source: play vars
15896 1727203869.81340: variable 'controller_profile' from source: play vars
15896 1727203869.81499: variable 'controller_profile' from source: play vars
15896 1727203869.81688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203869.81724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203869.81759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203869.81797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203869.81842: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203869.82133: variable 'network_connections' from source: task vars
15896 1727203869.82143: variable 'controller_profile' from source: play vars
15896 1727203869.82247: variable 'controller_profile' from source: play vars
15896 1727203869.82264: variable 'controller_device' from source: play vars
15896 1727203869.82382: variable 'controller_device' from source: play vars
15896 1727203869.82385: variable 'port1_profile' from source: play vars
15896 1727203869.82480: variable 'port1_profile' from source: play vars
15896 1727203869.82582: variable 'dhcp_interface1' from source: play vars
15896 1727203869.82594: variable 'dhcp_interface1' from source: play vars
15896 1727203869.82608: variable 'controller_profile' from source: play vars
15896 1727203869.82710: variable 'controller_profile' from source: play vars
15896 1727203869.82726: variable 'port2_profile' from source: play vars
15896 1727203869.82824: variable 'port2_profile' from source: play vars
15896 1727203869.82838: variable 'dhcp_interface2' from source: play vars
15896 1727203869.82939: variable 'dhcp_interface2' from source: play vars
15896 1727203869.82952: variable 'controller_profile' from source: play vars
15896 1727203869.83052: variable 'controller_profile' from source: play vars
15896 1727203869.83118: variable '__network_packages_default_wireless' from source: role '' defaults
15896 1727203869.83198: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203869.83503: variable 'network_connections' from source: task vars
15896 1727203869.83514: variable 'controller_profile' from source: play vars
15896 1727203869.83581: variable 'controller_profile' from source: play vars
15896 1727203869.83594: variable 'controller_device' from source: play vars
15896 1727203869.83684: variable 'controller_device' from source: play vars
15896 1727203869.83687: variable 'port1_profile' from source: play vars
15896 1727203869.83731: variable 'port1_profile' from source: play vars
15896 1727203869.83742: variable 'dhcp_interface1' from source: play vars
15896 1727203869.83805: variable 'dhcp_interface1' from source: play vars
15896 1727203869.83815: variable 'controller_profile' from source: play vars
15896 1727203869.83881: variable 'controller_profile' from source: play vars
15896 1727203869.84081: variable 'port2_profile' from source: play vars
15896 1727203869.84084: variable 'port2_profile' from source: play vars
15896 1727203869.84086: variable 'dhcp_interface2' from source: play vars
15896 1727203869.84087: variable 'dhcp_interface2' from source: play vars
15896 1727203869.84089: variable 'controller_profile' from source: play vars
15896 1727203869.84091: variable 'controller_profile' from source: play vars
15896 1727203869.84107: variable '__network_packages_default_team' from source: role '' defaults
15896 1727203869.84183: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203869.84477: variable 'network_connections' from source: task vars
15896 1727203869.84489: variable 'controller_profile' from source: play vars
15896 1727203869.84553: variable 'controller_profile' from source: play vars
15896 1727203869.84566: variable 'controller_device' from source: play vars
15896 1727203869.84632: variable 'controller_device' from source: play vars
15896 1727203869.84646: variable 'port1_profile' from source: play vars
15896 1727203869.84713: variable 'port1_profile' from source: play vars
15896 1727203869.84726: variable 'dhcp_interface1' from source: play vars
15896 1727203869.84791: variable 'dhcp_interface1' from source: play vars
15896 1727203869.84803: variable 'controller_profile' from source: play vars
15896 1727203869.84866: variable 'controller_profile' from source: play vars
15896 1727203869.84882: variable 'port2_profile' from source: play vars
15896 1727203869.84944: variable 'port2_profile' from source: play vars
15896 1727203869.84958: variable 'dhcp_interface2' from source: play vars
15896 1727203869.85022: variable 'dhcp_interface2' from source: play vars
15896 1727203869.85034: variable 'controller_profile' from source: play vars
15896 1727203869.85103: variable 'controller_profile' from source: play vars
15896 1727203869.85174: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203869.85236: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203869.85248: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203869.85312: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203869.85529: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15896 1727203869.85998: variable 'network_connections' from source: task vars
15896 1727203869.86007: variable 'controller_profile' from source: play vars
15896 1727203869.86065: variable 'controller_profile' from source: play vars
15896 1727203869.86088: variable 'controller_device' from source: play vars
15896 1727203869.86146: variable 'controller_device' from source: play vars
15896 1727203869.86159: variable 'port1_profile' from source: play vars
15896 1727203869.86221: variable 'port1_profile' from source: play vars
15896 1727203869.86236: variable 'dhcp_interface1' from source: play vars
15896 1727203869.86298: variable 'dhcp_interface1' from source: play vars
15896 1727203869.86340: variable 'controller_profile' from source: play vars
15896 1727203869.86368: variable 'controller_profile' from source: play vars
15896 1727203869.86381: variable 'port2_profile' from source: play vars
15896 1727203869.86440: variable 'port2_profile' from source: play vars
15896 1727203869.86455: variable 'dhcp_interface2' from source: play vars
15896 1727203869.86516: variable 'dhcp_interface2' from source: play vars
15896 1727203869.86556: variable 'controller_profile' from source: play vars
15896 1727203869.86593: variable 'controller_profile' from source: play vars
15896 1727203869.86606: variable 'ansible_distribution' from source: facts
15896 1727203869.86615: variable '__network_rh_distros' from source: role '' defaults
15896 1727203869.86624: variable 'ansible_distribution_major_version' from source: facts
15896 1727203869.86655: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15896 1727203869.86880: variable 'ansible_distribution' from source: facts
15896 1727203869.86883: variable '__network_rh_distros' from source: role '' defaults
15896 1727203869.86885: variable 'ansible_distribution_major_version' from source:
facts 15896 1727203869.86887: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203869.87017: variable 'ansible_distribution' from source: facts 15896 1727203869.87027: variable '__network_rh_distros' from source: role '' defaults 15896 1727203869.87038: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.87081: variable 'network_provider' from source: set_fact 15896 1727203869.87103: variable 'ansible_facts' from source: unknown 15896 1727203869.87758: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15896 1727203869.87767: when evaluation is False, skipping this task 15896 1727203869.87774: _execute() done 15896 1727203869.87783: dumping result to json 15896 1727203869.87790: done dumping result, returning 15896 1727203869.87801: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-fb83-b6ad-00000000002e] 15896 1727203869.87808: sending task result for task 028d2410-947f-fb83-b6ad-00000000002e 15896 1727203869.87916: done sending task result for task 028d2410-947f-fb83-b6ad-00000000002e 15896 1727203869.87923: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15896 1727203869.87972: no more pending results, returning what we have 15896 1727203869.87977: results queue empty 15896 1727203869.87978: checking for any_errors_fatal 15896 1727203869.87983: done checking for any_errors_fatal 15896 1727203869.87984: checking for max_fail_percentage 15896 1727203869.87986: done checking for max_fail_percentage 15896 1727203869.87987: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.87987: done checking to see if all hosts have failed 15896 1727203869.87988: getting the remaining hosts for 
this loop 15896 1727203869.87990: done getting the remaining hosts for this loop 15896 1727203869.87994: getting the next task for host managed-node1 15896 1727203869.88000: done getting next task for host managed-node1 15896 1727203869.88004: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203869.88007: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203869.88021: getting variables 15896 1727203869.88024: in VariableManager get_vars() 15896 1727203869.88076: Calling all_inventory to load vars for managed-node1 15896 1727203869.88080: Calling groups_inventory to load vars for managed-node1 15896 1727203869.88082: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.88092: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.88095: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.88098: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.93708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203869.95186: done with get_vars() 15896 1727203869.95213: done getting variables 15896 1727203869.95265: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:51:09 -0400 (0:00:00.334) 0:00:15.542 ***** 15896 1727203869.95298: entering _queue_task() for managed-node1/package 15896 1727203869.95635: worker is 1 (out of 1 available) 15896 1727203869.95646: exiting _queue_task() for managed-node1/package 15896 1727203869.95657: done queuing things up, now waiting for results queue to drain 15896 1727203869.95658: waiting for pending results... 15896 1727203869.95930: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203869.96079: in run() - task 028d2410-947f-fb83-b6ad-00000000002f 15896 1727203869.96103: variable 'ansible_search_path' from source: unknown 15896 1727203869.96111: variable 'ansible_search_path' from source: unknown 15896 1727203869.96149: calling self._execute() 15896 1727203869.96250: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203869.96281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203869.96284: variable 'omit' from source: magic vars 15896 1727203869.96669: variable 'ansible_distribution_major_version' from source: facts 15896 1727203869.96881: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203869.96885: variable 'network_state' from source: role '' defaults 15896 1727203869.96888: Evaluated conditional (network_state != {}): False 15896 1727203869.96890: when evaluation is False, skipping this task 15896 1727203869.96893: _execute() done 15896 
1727203869.96895: dumping result to json 15896 1727203869.96897: done dumping result, returning 15896 1727203869.96900: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-fb83-b6ad-00000000002f] 15896 1727203869.96903: sending task result for task 028d2410-947f-fb83-b6ad-00000000002f 15896 1727203869.96982: done sending task result for task 028d2410-947f-fb83-b6ad-00000000002f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203869.97034: no more pending results, returning what we have 15896 1727203869.97038: results queue empty 15896 1727203869.97039: checking for any_errors_fatal 15896 1727203869.97047: done checking for any_errors_fatal 15896 1727203869.97048: checking for max_fail_percentage 15896 1727203869.97049: done checking for max_fail_percentage 15896 1727203869.97050: checking to see if all hosts have failed and the running result is not ok 15896 1727203869.97052: done checking to see if all hosts have failed 15896 1727203869.97053: getting the remaining hosts for this loop 15896 1727203869.97055: done getting the remaining hosts for this loop 15896 1727203869.97058: getting the next task for host managed-node1 15896 1727203869.97067: done getting next task for host managed-node1 15896 1727203869.97071: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15896 1727203869.97074: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203869.97094: getting variables 15896 1727203869.97096: in VariableManager get_vars() 15896 1727203869.97155: Calling all_inventory to load vars for managed-node1 15896 1727203869.97158: Calling groups_inventory to load vars for managed-node1 15896 1727203869.97161: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203869.97174: Calling all_plugins_play to load vars for managed-node1 15896 1727203869.97384: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203869.97390: Calling groups_plugins_play to load vars for managed-node1 15896 1727203869.98089: WORKER PROCESS EXITING 15896 1727203869.98788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203870.01087: done with get_vars() 15896 1727203870.01114: done getting variables 15896 1727203870.01177: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:51:10 -0400 (0:00:00.059) 0:00:15.601 ***** 15896 1727203870.01212: entering _queue_task() for managed-node1/package 15896 1727203870.01536: worker is 1 (out of 1 available) 15896 1727203870.01547: exiting _queue_task() for managed-node1/package 15896 1727203870.01559: done queuing things up, now waiting for results queue to drain 15896 1727203870.01561: waiting for pending 
results... 15896 1727203870.01844: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15896 1727203870.01968: in run() - task 028d2410-947f-fb83-b6ad-000000000030 15896 1727203870.01990: variable 'ansible_search_path' from source: unknown 15896 1727203870.02003: variable 'ansible_search_path' from source: unknown 15896 1727203870.02085: calling self._execute() 15896 1727203870.02196: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203870.02208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203870.02225: variable 'omit' from source: magic vars 15896 1727203870.02702: variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.02719: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203870.02844: variable 'network_state' from source: role '' defaults 15896 1727203870.02863: Evaluated conditional (network_state != {}): False 15896 1727203870.02877: when evaluation is False, skipping this task 15896 1727203870.02885: _execute() done 15896 1727203870.02893: dumping result to json 15896 1727203870.02901: done dumping result, returning 15896 1727203870.02912: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000030] 15896 1727203870.02921: sending task result for task 028d2410-947f-fb83-b6ad-000000000030 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203870.03173: no more pending results, returning what we have 15896 1727203870.03179: results queue empty 15896 1727203870.03181: checking for any_errors_fatal 15896 1727203870.03190: done checking for any_errors_fatal 15896 1727203870.03191: checking for max_fail_percentage 15896 
1727203870.03193: done checking for max_fail_percentage 15896 1727203870.03195: checking to see if all hosts have failed and the running result is not ok 15896 1727203870.03195: done checking to see if all hosts have failed 15896 1727203870.03196: getting the remaining hosts for this loop 15896 1727203870.03198: done getting the remaining hosts for this loop 15896 1727203870.03201: getting the next task for host managed-node1 15896 1727203870.03209: done getting next task for host managed-node1 15896 1727203870.03214: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15896 1727203870.03218: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203870.03235: getting variables 15896 1727203870.03237: in VariableManager get_vars() 15896 1727203870.03485: Calling all_inventory to load vars for managed-node1 15896 1727203870.03488: Calling groups_inventory to load vars for managed-node1 15896 1727203870.03490: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203870.03497: done sending task result for task 028d2410-947f-fb83-b6ad-000000000030 15896 1727203870.03500: WORKER PROCESS EXITING 15896 1727203870.03510: Calling all_plugins_play to load vars for managed-node1 15896 1727203870.03513: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203870.03516: Calling groups_plugins_play to load vars for managed-node1 15896 1727203870.05889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203870.07727: done with get_vars() 15896 1727203870.07780: done getting variables 15896 1727203870.07879: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:51:10 -0400 (0:00:00.066) 0:00:15.668 ***** 15896 1727203870.07910: entering _queue_task() for managed-node1/service 15896 1727203870.07912: Creating lock for service 15896 1727203870.08263: worker is 1 (out of 1 available) 15896 1727203870.08479: exiting _queue_task() for managed-node1/service 15896 1727203870.08490: done queuing things up, now waiting for results queue to drain 15896 1727203870.08491: waiting for pending results... 
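The "Install packages" skip earlier in this run hinges on a single subset test: `not network_packages is subset(ansible_facts.packages.keys())` evaluated False, meaning every required package already appeared in the gathered package facts. A minimal standalone restatement of that decision in plain Python (hypothetical helper; the role actually evaluates this as a Jinja2 `subset` test, not Python code):

```python
# Sketch of the skip decision behind the "Install packages" task.
# Hypothetical re-implementation for illustration only; the role itself
# runs this as a Jinja2 conditional, not as Python.

def should_install(network_packages, installed_packages):
    """Return True only when at least one required package is missing."""
    # Mirrors: not network_packages is subset(ansible_facts.packages.keys())
    return not set(network_packages).issubset(installed_packages)

# In this run, package facts already contained everything network_packages
# listed, so the conditional was False and the task was skipped.
facts_packages = {"NetworkManager": [{"version": "1.48.0"}]}
print(should_install(["NetworkManager"], facts_packages.keys()))            # skip
print(should_install(["NetworkManager", "nmstate"], facts_packages.keys())) # run
```

When the conditional comes back False, the executor short-circuits exactly as the log shows: "when evaluation is False, skipping this task", and the JSON result records the failed condition under `false_condition`.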
15896 1727203870.08571: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15896 1727203870.08713: in run() - task 028d2410-947f-fb83-b6ad-000000000031 15896 1727203870.08830: variable 'ansible_search_path' from source: unknown 15896 1727203870.08835: variable 'ansible_search_path' from source: unknown 15896 1727203870.08838: calling self._execute() 15896 1727203870.08882: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203870.08894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203870.08908: variable 'omit' from source: magic vars 15896 1727203870.09285: variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.09304: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203870.09434: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203870.10027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203870.12740: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203870.12799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203870.12828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203870.12854: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203870.12879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203870.12937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15896 1727203870.12961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.12981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.13009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.13020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.13054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.13071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.13093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.13120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.13130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.13161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.13178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.13197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.13224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.13235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.13355: variable 'network_connections' from source: task vars 15896 1727203870.13367: variable 'controller_profile' from source: play vars 15896 1727203870.13421: variable 'controller_profile' from source: play vars 15896 1727203870.13430: variable 'controller_device' from source: play vars 15896 1727203870.13474: variable 'controller_device' from source: play vars 15896 1727203870.13484: variable 'port1_profile' from source: play vars 15896 1727203870.13528: variable 'port1_profile' from source: play vars 15896 1727203870.13535: variable 'dhcp_interface1' from source: play vars 15896 1727203870.13578: variable 'dhcp_interface1' from source: play vars 15896 1727203870.13583: variable 'controller_profile' from source: play vars 15896 
1727203870.13627: variable 'controller_profile' from source: play vars 15896 1727203870.13630: variable 'port2_profile' from source: play vars 15896 1727203870.13677: variable 'port2_profile' from source: play vars 15896 1727203870.13697: variable 'dhcp_interface2' from source: play vars 15896 1727203870.13737: variable 'dhcp_interface2' from source: play vars 15896 1727203870.13745: variable 'controller_profile' from source: play vars 15896 1727203870.13791: variable 'controller_profile' from source: play vars 15896 1727203870.13989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203870.14038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203870.14077: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203870.14104: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203870.14134: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203870.14181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203870.14217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203870.14228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.14254: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 15896 1727203870.14378: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203870.14560: variable 'network_connections' from source: task vars 15896 1727203870.14570: variable 'controller_profile' from source: play vars 15896 1727203870.14629: variable 'controller_profile' from source: play vars 15896 1727203870.14632: variable 'controller_device' from source: play vars 15896 1727203870.14825: variable 'controller_device' from source: play vars 15896 1727203870.14838: variable 'port1_profile' from source: play vars 15896 1727203870.15004: variable 'port1_profile' from source: play vars 15896 1727203870.15007: variable 'dhcp_interface1' from source: play vars 15896 1727203870.15009: variable 'dhcp_interface1' from source: play vars 15896 1727203870.15012: variable 'controller_profile' from source: play vars 15896 1727203870.15074: variable 'controller_profile' from source: play vars 15896 1727203870.15092: variable 'port2_profile' from source: play vars 15896 1727203870.15242: variable 'port2_profile' from source: play vars 15896 1727203870.15245: variable 'dhcp_interface2' from source: play vars 15896 1727203870.15247: variable 'dhcp_interface2' from source: play vars 15896 1727203870.15252: variable 'controller_profile' from source: play vars 15896 1727203870.15313: variable 'controller_profile' from source: play vars 15896 1727203870.15363: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203870.15370: when evaluation is False, skipping this task 15896 1727203870.15378: _execute() done 15896 1727203870.15386: dumping result to json 15896 1727203870.15393: done dumping result, returning 15896 1727203870.15404: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000031] 15896 1727203870.15413: sending task result for task 
028d2410-947f-fb83-b6ad-000000000031 15896 1727203870.15781: done sending task result for task 028d2410-947f-fb83-b6ad-000000000031 15896 1727203870.15785: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203870.15834: no more pending results, returning what we have 15896 1727203870.15837: results queue empty 15896 1727203870.15838: checking for any_errors_fatal 15896 1727203870.15843: done checking for any_errors_fatal 15896 1727203870.15844: checking for max_fail_percentage 15896 1727203870.15845: done checking for max_fail_percentage 15896 1727203870.15846: checking to see if all hosts have failed and the running result is not ok 15896 1727203870.15847: done checking to see if all hosts have failed 15896 1727203870.15847: getting the remaining hosts for this loop 15896 1727203870.15849: done getting the remaining hosts for this loop 15896 1727203870.15853: getting the next task for host managed-node1 15896 1727203870.15859: done getting next task for host managed-node1 15896 1727203870.15863: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203870.15866: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203870.15882: getting variables 15896 1727203870.15883: in VariableManager get_vars() 15896 1727203870.15946: Calling all_inventory to load vars for managed-node1 15896 1727203870.15949: Calling groups_inventory to load vars for managed-node1 15896 1727203870.15951: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203870.15961: Calling all_plugins_play to load vars for managed-node1 15896 1727203870.15964: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203870.15967: Calling groups_plugins_play to load vars for managed-node1 15896 1727203870.16884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203870.17872: done with get_vars() 15896 1727203870.17890: done getting variables 15896 1727203870.17933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:51:10 -0400 (0:00:00.100) 0:00:15.768 ***** 15896 1727203870.17957: entering _queue_task() for managed-node1/service 15896 1727203870.18195: worker is 1 (out of 1 available) 15896 1727203870.18209: exiting _queue_task() for managed-node1/service 15896 1727203870.18220: done queuing things up, now waiting for results queue to drain 15896 1727203870.18222: waiting for pending results... 
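The stretch of log above shows three of the role's gating conditionals in sequence: `network_state != {}` (False, so both nmstate install tasks skip), `__network_wireless_connections_defined or __network_team_connections_defined` (False, so the NetworkManager restart skips), and `network_provider == "nm" or network_state != {}` (True, so "Enable and start NetworkManager" proceeds). A rough Python restatement of that gating, assuming the variables behave as plain values (an illustrative sketch, not role code):

```python
# Hedged restatement of the conditional gates seen in this run; the key
# names mirror the role's task names, but this is illustration only.

def gates(network_provider, network_state, wireless_defined, team_defined):
    """Map each gated task to whether its `when:` condition would hold."""
    return {
        # "Install NetworkManager and nmstate / python3-libnmstate when
        # using network_state variable"
        "install_nmstate": network_state != {},
        # "Restart NetworkManager due to wireless or team interfaces"
        "restart_nm": wireless_defined or team_defined,
        # "Enable and start NetworkManager"
        "enable_nm": network_provider == "nm" or network_state != {},
    }

# This run: provider "nm", empty network_state, no wireless or team
# connections -> only the enable/start task runs.
print(gates("nm", {}, False, False))
```

This matches the log's outcomes: two `skipping: [managed-node1]` results with `"skip_reason": "Conditional result was False"`, followed by the service task being queued for execution.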
15896 1727203870.18411: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203870.18511: in run() - task 028d2410-947f-fb83-b6ad-000000000032 15896 1727203870.18522: variable 'ansible_search_path' from source: unknown 15896 1727203870.18529: variable 'ansible_search_path' from source: unknown 15896 1727203870.18574: calling self._execute() 15896 1727203870.18685: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203870.18689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203870.18692: variable 'omit' from source: magic vars 15896 1727203870.19181: variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.19185: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203870.19235: variable 'network_provider' from source: set_fact 15896 1727203870.19246: variable 'network_state' from source: role '' defaults 15896 1727203870.19263: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15896 1727203870.19278: variable 'omit' from source: magic vars 15896 1727203870.19332: variable 'omit' from source: magic vars 15896 1727203870.19366: variable 'network_service_name' from source: role '' defaults 15896 1727203870.19434: variable 'network_service_name' from source: role '' defaults 15896 1727203870.19556: variable '__network_provider_setup' from source: role '' defaults 15896 1727203870.19572: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203870.19641: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203870.19654: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203870.19725: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203870.20268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15896 1727203870.22305: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203870.22356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203870.22390: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203870.22415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203870.22434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203870.22498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.22518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.22535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.22562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.22578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.22612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203870.22628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.22645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.22675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.22686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.22838: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203870.22919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.22936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.22953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.22981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.22992: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.23056: variable 'ansible_python' from source: facts 15896 1727203870.23077: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203870.23133: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203870.23190: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203870.23480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.23484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.23487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.23489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.23491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.23493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203870.23566: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203870.23648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.23796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203870.24188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203870.24244: variable 'network_connections' from source: task vars 15896 1727203870.24258: variable 'controller_profile' from source: play vars 15896 1727203870.24354: variable 'controller_profile' from source: play vars 15896 1727203870.24374: variable 'controller_device' from source: play vars 15896 1727203870.24468: variable 'controller_device' from source: play vars 15896 1727203870.24489: variable 'port1_profile' from source: play vars 15896 1727203870.24577: variable 'port1_profile' from source: play vars 15896 1727203870.24599: variable 'dhcp_interface1' from source: play vars 15896 1727203870.24689: variable 'dhcp_interface1' from source: play vars 15896 1727203870.24703: variable 'controller_profile' from source: play vars 15896 1727203870.24794: variable 'controller_profile' from source: play vars 15896 1727203870.24807: variable 'port2_profile' from source: play vars 15896 1727203870.24896: variable 'port2_profile' from source: play vars 15896 1727203870.24915: variable 'dhcp_interface2' from source: play vars 15896 1727203870.24999: variable 'dhcp_interface2' from source: play vars 15896 
1727203870.25008: variable 'controller_profile' from source: play vars 15896 1727203870.25072: variable 'controller_profile' from source: play vars 15896 1727203870.25142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203870.25296: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203870.25330: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203870.25388: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203870.25421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203870.25473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203870.25503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203870.25538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203870.25563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203870.25618: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203870.25889: variable 'network_connections' from source: task vars 15896 1727203870.25904: variable 'controller_profile' from source: play vars 15896 1727203870.25992: variable 'controller_profile' from source: play vars 15896 
1727203870.26015: variable 'controller_device' from source: play vars 15896 1727203870.26157: variable 'controller_device' from source: play vars 15896 1727203870.26162: variable 'port1_profile' from source: play vars 15896 1727203870.26209: variable 'port1_profile' from source: play vars 15896 1727203870.26225: variable 'dhcp_interface1' from source: play vars 15896 1727203870.26310: variable 'dhcp_interface1' from source: play vars 15896 1727203870.26381: variable 'controller_profile' from source: play vars 15896 1727203870.26411: variable 'controller_profile' from source: play vars 15896 1727203870.26428: variable 'port2_profile' from source: play vars 15896 1727203870.26519: variable 'port2_profile' from source: play vars 15896 1727203870.26535: variable 'dhcp_interface2' from source: play vars 15896 1727203870.26622: variable 'dhcp_interface2' from source: play vars 15896 1727203870.26630: variable 'controller_profile' from source: play vars 15896 1727203870.26702: variable 'controller_profile' from source: play vars 15896 1727203870.26745: variable '__network_packages_default_wireless' from source: role '' defaults 15896 1727203870.26805: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203870.27001: variable 'network_connections' from source: task vars 15896 1727203870.27005: variable 'controller_profile' from source: play vars 15896 1727203870.27055: variable 'controller_profile' from source: play vars 15896 1727203870.27064: variable 'controller_device' from source: play vars 15896 1727203870.27113: variable 'controller_device' from source: play vars 15896 1727203870.27120: variable 'port1_profile' from source: play vars 15896 1727203870.27172: variable 'port1_profile' from source: play vars 15896 1727203870.27180: variable 'dhcp_interface1' from source: play vars 15896 1727203870.27227: variable 'dhcp_interface1' from source: play vars 15896 1727203870.27232: variable 'controller_profile' from source: play vars 
15896 1727203870.27286: variable 'controller_profile' from source: play vars 15896 1727203870.27292: variable 'port2_profile' from source: play vars 15896 1727203870.27339: variable 'port2_profile' from source: play vars 15896 1727203870.27345: variable 'dhcp_interface2' from source: play vars 15896 1727203870.27400: variable 'dhcp_interface2' from source: play vars 15896 1727203870.27406: variable 'controller_profile' from source: play vars 15896 1727203870.27453: variable 'controller_profile' from source: play vars 15896 1727203870.27480: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203870.27532: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203870.27730: variable 'network_connections' from source: task vars 15896 1727203870.27733: variable 'controller_profile' from source: play vars 15896 1727203870.27785: variable 'controller_profile' from source: play vars 15896 1727203870.27791: variable 'controller_device' from source: play vars 15896 1727203870.27842: variable 'controller_device' from source: play vars 15896 1727203870.27849: variable 'port1_profile' from source: play vars 15896 1727203870.27901: variable 'port1_profile' from source: play vars 15896 1727203870.27907: variable 'dhcp_interface1' from source: play vars 15896 1727203870.27957: variable 'dhcp_interface1' from source: play vars 15896 1727203870.27965: variable 'controller_profile' from source: play vars 15896 1727203870.28013: variable 'controller_profile' from source: play vars 15896 1727203870.28023: variable 'port2_profile' from source: play vars 15896 1727203870.28072: variable 'port2_profile' from source: play vars 15896 1727203870.28079: variable 'dhcp_interface2' from source: play vars 15896 1727203870.28125: variable 'dhcp_interface2' from source: play vars 15896 1727203870.28134: variable 'controller_profile' from source: play vars 15896 1727203870.28185: variable 'controller_profile' from source: play vars 
15896 1727203870.28229: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203870.28279: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203870.28284: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203870.28325: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203870.28463: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203870.28769: variable 'network_connections' from source: task vars 15896 1727203870.28772: variable 'controller_profile' from source: play vars 15896 1727203870.28820: variable 'controller_profile' from source: play vars 15896 1727203870.28826: variable 'controller_device' from source: play vars 15896 1727203870.28868: variable 'controller_device' from source: play vars 15896 1727203870.28877: variable 'port1_profile' from source: play vars 15896 1727203870.28921: variable 'port1_profile' from source: play vars 15896 1727203870.28927: variable 'dhcp_interface1' from source: play vars 15896 1727203870.28970: variable 'dhcp_interface1' from source: play vars 15896 1727203870.28974: variable 'controller_profile' from source: play vars 15896 1727203870.29019: variable 'controller_profile' from source: play vars 15896 1727203870.29025: variable 'port2_profile' from source: play vars 15896 1727203870.29068: variable 'port2_profile' from source: play vars 15896 1727203870.29074: variable 'dhcp_interface2' from source: play vars 15896 1727203870.29118: variable 'dhcp_interface2' from source: play vars 15896 1727203870.29121: variable 'controller_profile' from source: play vars 15896 1727203870.29163: variable 'controller_profile' from source: play vars 15896 1727203870.29171: variable 'ansible_distribution' from source: facts 15896 1727203870.29174: variable '__network_rh_distros' from source: role '' defaults 15896 1727203870.29181: 
variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.29202: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203870.29381: variable 'ansible_distribution' from source: facts 15896 1727203870.29384: variable '__network_rh_distros' from source: role '' defaults 15896 1727203870.29386: variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.29473: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203870.29612: variable 'ansible_distribution' from source: facts 15896 1727203870.29622: variable '__network_rh_distros' from source: role '' defaults 15896 1727203870.29709: variable 'ansible_distribution_major_version' from source: facts 15896 1727203870.29713: variable 'network_provider' from source: set_fact 15896 1727203870.29715: variable 'omit' from source: magic vars 15896 1727203870.29738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203870.29769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203870.29792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203870.29814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203870.29840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203870.29943: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203870.29946: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203870.29949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203870.29998: Set connection var ansible_shell_type to sh 15896 
1727203870.30011: Set connection var ansible_connection to ssh 15896 1727203870.30022: Set connection var ansible_shell_executable to /bin/sh 15896 1727203870.30051: Set connection var ansible_pipelining to False 15896 1727203870.30054: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203870.30061: Set connection var ansible_timeout to 10 15896 1727203870.30156: variable 'ansible_shell_executable' from source: unknown 15896 1727203870.30159: variable 'ansible_connection' from source: unknown 15896 1727203870.30161: variable 'ansible_module_compression' from source: unknown 15896 1727203870.30163: variable 'ansible_shell_type' from source: unknown 15896 1727203870.30166: variable 'ansible_shell_executable' from source: unknown 15896 1727203870.30168: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203870.30170: variable 'ansible_pipelining' from source: unknown 15896 1727203870.30172: variable 'ansible_timeout' from source: unknown 15896 1727203870.30173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203870.30244: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203870.30252: variable 'omit' from source: magic vars 15896 1727203870.30257: starting attempt loop 15896 1727203870.30265: running the handler 15896 1727203870.30345: variable 'ansible_facts' from source: unknown 15896 1727203870.30821: _low_level_execute_command(): starting 15896 1727203870.30829: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203870.31325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203870.31330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203870.31332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.31374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203870.31389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.31485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203870.33278: stdout chunk (state=3): >>>/root <<< 15896 1727203870.33379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203870.33408: stderr chunk (state=3): >>><<< 15896 1727203870.33411: stdout chunk (state=3): >>><<< 15896 1727203870.33429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203870.33443: _low_level_execute_command(): starting 15896 1727203870.33450: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236 `" && echo ansible-tmp-1727203870.3342965-17411-64703395451236="` echo /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236 `" ) && sleep 0' 15896 1727203870.33911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203870.33915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.33918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.33967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203870.33970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203870.33972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.34058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203870.36177: stdout chunk (state=3): >>>ansible-tmp-1727203870.3342965-17411-64703395451236=/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236 <<< 15896 1727203870.36289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203870.36316: stderr chunk (state=3): >>><<< 15896 1727203870.36319: stdout chunk (state=3): >>><<< 15896 1727203870.36332: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203870.3342965-17411-64703395451236=/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203870.36359: variable 'ansible_module_compression' from source: unknown 15896 1727203870.36409: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15896 1727203870.36412: ANSIBALLZ: Acquiring lock 15896 1727203870.36415: ANSIBALLZ: Lock acquired: 140082272719056 15896 1727203870.36417: ANSIBALLZ: Creating module 15896 1727203870.55879: ANSIBALLZ: Writing module into payload 15896 1727203870.55989: ANSIBALLZ: Writing module 15896 1727203870.56010: ANSIBALLZ: Renaming module 15896 1727203870.56016: ANSIBALLZ: Done creating module 15896 1727203870.56048: variable 'ansible_facts' from source: unknown 15896 1727203870.56194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py 15896 1727203870.56301: Sending initial data 15896 1727203870.56304: Sent initial data (155 bytes) 15896 1727203870.56737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203870.56775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203870.56781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.56783: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203870.56785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203870.56787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.56833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203870.56836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203870.56838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.56924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203870.58704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203870.58781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203870.58857: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpslv6kx2_ /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py <<< 15896 1727203870.58865: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py" <<< 15896 1727203870.58931: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpslv6kx2_" to remote "/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py" <<< 15896 1727203870.58934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py" <<< 15896 1727203870.60197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203870.60242: stderr chunk (state=3): >>><<< 15896 1727203870.60246: stdout chunk (state=3): >>><<< 15896 1727203870.60257: done transferring module to remote 15896 1727203870.60269: _low_level_execute_command(): starting 15896 1727203870.60272: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/ /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py && sleep 0' 15896 1727203870.60710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203870.60714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203870.60725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.60777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203870.60781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.60867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203870.62896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203870.62917: stderr chunk (state=3): >>><<< 15896 1727203870.62925: stdout chunk (state=3): >>><<< 15896 1727203870.62945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203870.62953: _low_level_execute_command(): starting 15896 1727203870.63037: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/AnsiballZ_systemd.py && sleep 0' 15896 1727203870.63541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203870.63544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203870.63547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.63598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203870.63628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203870.63644: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.63773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203870.94888: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10502144", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3289210880", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "644294000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15896 1727203870.94902: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network<<< 15896 1727203870.94917: stdout chunk (state=3): >>>-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15896 1727203870.97112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203870.97140: stderr chunk (state=3): >>><<< 15896 1727203870.97143: stdout chunk (state=3): >>><<< 15896 1727203870.97162: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10502144", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3289210880", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "644294000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": 
"[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203870.97302: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203870.97319: _low_level_execute_command(): starting 15896 1727203870.97322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203870.3342965-17411-64703395451236/ > /dev/null 2>&1 && sleep 0' 15896 1727203870.97770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203870.97801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203870.97804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.97806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203870.97808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203870.97810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203870.97867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203870.97874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203870.97880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203870.97949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203871.00024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203871.00028: stdout chunk (state=3): >>><<< 15896 1727203871.00031: stderr chunk (state=3): >>><<< 15896 1727203871.00072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203871.00078: handler run complete 15896 1727203871.00173: attempt loop complete, returning result 15896 1727203871.00179: _execute() done 15896 1727203871.00182: dumping result to json 15896 1727203871.00184: done dumping result, returning 15896 1727203871.00186: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-fb83-b6ad-000000000032] 15896 1727203871.00188: sending task result for task 028d2410-947f-fb83-b6ad-000000000032 15896 1727203871.00614: done sending task result for task 028d2410-947f-fb83-b6ad-000000000032 15896 1727203871.00617: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203871.00682: no more pending results, returning what we have 15896 1727203871.00685: results queue empty 15896 1727203871.00685: checking for any_errors_fatal 15896 1727203871.00690: done checking for any_errors_fatal 15896 1727203871.00691: checking for max_fail_percentage 15896 1727203871.00693: done 
checking for max_fail_percentage 15896 1727203871.00694: checking to see if all hosts have failed and the running result is not ok 15896 1727203871.00694: done checking to see if all hosts have failed 15896 1727203871.00695: getting the remaining hosts for this loop 15896 1727203871.00696: done getting the remaining hosts for this loop 15896 1727203871.00699: getting the next task for host managed-node1 15896 1727203871.00705: done getting next task for host managed-node1 15896 1727203871.00708: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203871.00710: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203871.00719: getting variables 15896 1727203871.00721: in VariableManager get_vars() 15896 1727203871.00766: Calling all_inventory to load vars for managed-node1 15896 1727203871.00769: Calling groups_inventory to load vars for managed-node1 15896 1727203871.00772: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203871.00791: Calling all_plugins_play to load vars for managed-node1 15896 1727203871.00794: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203871.00798: Calling groups_plugins_play to load vars for managed-node1 15896 1727203871.01987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203871.02908: done with get_vars() 15896 1727203871.02926: done getting variables 15896 1727203871.02969: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:11 -0400 (0:00:00.850) 0:00:16.619 ***** 15896 1727203871.02996: entering _queue_task() for managed-node1/service 15896 1727203871.03236: worker is 1 (out of 1 available) 15896 1727203871.03250: exiting _queue_task() for managed-node1/service 15896 1727203871.03261: done queuing things up, now waiting for results queue to drain 15896 1727203871.03263: waiting for pending results... 
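The wpa_supplicant task queued above (task path `roles/network/tasks/main.yml:133`) is guarded by role conditionals; the log goes on to show it skipping because `__network_wpa_supplicant_required` evaluates false (no 802.1x or wireless connections are defined). A minimal sketch of what such a guarded service task can look like, where the task body is an assumption inferred from the task name and the conditionals visible in the log, not the role's actual source:

```yaml
# Hypothetical sketch of the conditionally-guarded service task.
# Both conditions appear as "Evaluated conditional" entries in the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

With a list under `when:`, Ansible AND-combines the conditions, so the second one short-circuiting to false is enough to produce the `skip_reason: Conditional result was False` result seen below.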
15896 1727203871.03457: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203871.03550: in run() - task 028d2410-947f-fb83-b6ad-000000000033 15896 1727203871.03780: variable 'ansible_search_path' from source: unknown 15896 1727203871.03784: variable 'ansible_search_path' from source: unknown 15896 1727203871.03787: calling self._execute() 15896 1727203871.03794: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.03797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.03800: variable 'omit' from source: magic vars 15896 1727203871.04031: variable 'ansible_distribution_major_version' from source: facts 15896 1727203871.04040: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203871.04124: variable 'network_provider' from source: set_fact 15896 1727203871.04130: Evaluated conditional (network_provider == "nm"): True 15896 1727203871.04198: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203871.04260: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203871.04379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203871.06836: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203871.06887: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203871.06914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203871.06940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203871.06962: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203871.07022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203871.07044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203871.07063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203871.07095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203871.07106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203871.07139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203871.07154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203871.07172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203871.07202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203871.07214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203871.07241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203871.07256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203871.07274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203871.07302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203871.07312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203871.07409: variable 'network_connections' from source: task vars 15896 1727203871.07420: variable 'controller_profile' from source: play vars 15896 1727203871.07472: variable 'controller_profile' from source: play vars 15896 1727203871.07482: variable 'controller_device' from source: play vars 15896 1727203871.07526: variable 'controller_device' from source: play vars 15896 1727203871.07534: variable 'port1_profile' from source: play vars 15896 1727203871.07577: variable 'port1_profile' from source: play vars 15896 
1727203871.07583: variable 'dhcp_interface1' from source: play vars 15896 1727203871.07629: variable 'dhcp_interface1' from source: play vars 15896 1727203871.07632: variable 'controller_profile' from source: play vars 15896 1727203871.07672: variable 'controller_profile' from source: play vars 15896 1727203871.07678: variable 'port2_profile' from source: play vars 15896 1727203871.07719: variable 'port2_profile' from source: play vars 15896 1727203871.07725: variable 'dhcp_interface2' from source: play vars 15896 1727203871.07770: variable 'dhcp_interface2' from source: play vars 15896 1727203871.07777: variable 'controller_profile' from source: play vars 15896 1727203871.07830: variable 'controller_profile' from source: play vars 15896 1727203871.07886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203871.07999: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203871.08026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203871.08047: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203871.08071: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203871.08104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203871.08119: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203871.08136: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 15896 1727203871.08160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203871.08200: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203871.08386: variable 'network_connections' from source: task vars 15896 1727203871.08390: variable 'controller_profile' from source: play vars 15896 1727203871.08481: variable 'controller_profile' from source: play vars 15896 1727203871.08484: variable 'controller_device' from source: play vars 15896 1727203871.08487: variable 'controller_device' from source: play vars 15896 1727203871.08500: variable 'port1_profile' from source: play vars 15896 1727203871.08558: variable 'port1_profile' from source: play vars 15896 1727203871.08570: variable 'dhcp_interface1' from source: play vars 15896 1727203871.08631: variable 'dhcp_interface1' from source: play vars 15896 1727203871.08642: variable 'controller_profile' from source: play vars 15896 1727203871.08703: variable 'controller_profile' from source: play vars 15896 1727203871.08714: variable 'port2_profile' from source: play vars 15896 1727203871.08772: variable 'port2_profile' from source: play vars 15896 1727203871.08787: variable 'dhcp_interface2' from source: play vars 15896 1727203871.08846: variable 'dhcp_interface2' from source: play vars 15896 1727203871.08984: variable 'controller_profile' from source: play vars 15896 1727203871.08987: variable 'controller_profile' from source: play vars 15896 1727203871.08989: Evaluated conditional (__network_wpa_supplicant_required): False 15896 1727203871.08991: when evaluation is False, skipping this task 15896 1727203871.08993: _execute() done 15896 1727203871.08995: dumping result to json 15896 1727203871.08997: done dumping result, returning 15896 1727203871.08999: done running TaskExecutor() for 
managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-fb83-b6ad-000000000033] 15896 1727203871.09002: sending task result for task 028d2410-947f-fb83-b6ad-000000000033 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15896 1727203871.09147: no more pending results, returning what we have 15896 1727203871.09150: results queue empty 15896 1727203871.09151: checking for any_errors_fatal 15896 1727203871.09169: done checking for any_errors_fatal 15896 1727203871.09170: checking for max_fail_percentage 15896 1727203871.09172: done checking for max_fail_percentage 15896 1727203871.09173: checking to see if all hosts have failed and the running result is not ok 15896 1727203871.09173: done checking to see if all hosts have failed 15896 1727203871.09174: getting the remaining hosts for this loop 15896 1727203871.09177: done getting the remaining hosts for this loop 15896 1727203871.09181: getting the next task for host managed-node1 15896 1727203871.09188: done getting next task for host managed-node1 15896 1727203871.09192: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203871.09195: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203871.09209: getting variables 15896 1727203871.09210: in VariableManager get_vars() 15896 1727203871.09262: Calling all_inventory to load vars for managed-node1 15896 1727203871.09265: Calling groups_inventory to load vars for managed-node1 15896 1727203871.09267: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203871.09496: Calling all_plugins_play to load vars for managed-node1 15896 1727203871.09501: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203871.09506: done sending task result for task 028d2410-947f-fb83-b6ad-000000000033 15896 1727203871.09509: WORKER PROCESS EXITING 15896 1727203871.09513: Calling groups_plugins_play to load vars for managed-node1 15896 1727203871.11017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203871.14463: done with get_vars() 15896 1727203871.14497: done getting variables 15896 1727203871.14671: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:51:11 -0400 (0:00:00.117) 0:00:16.736 ***** 15896 1727203871.14706: entering _queue_task() for managed-node1/service 15896 1727203871.15206: worker is 1 (out of 1 available) 15896 1727203871.15219: exiting _queue_task() for managed-node1/service 15896 1727203871.15231: done queuing things up, now waiting for results queue to drain 15896 1727203871.15232: waiting for pending results... 
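The "Enable network service" task queued here belongs to the legacy initscripts provider path; since `network_provider` resolved to `"nm"` earlier in the run, the log shows `Evaluated conditional (network_provider == "initscripts"): False` and the task is skipped. A hedged sketch of that provider switch (the task body is assumed from the task name, not taken from the role source):

```yaml
# Hypothetical sketch: initscripts-only tasks are gated on the
# resolved provider, so they no-op on NetworkManager systems.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
```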
15896 1727203871.15538: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203871.15770: in run() - task 028d2410-947f-fb83-b6ad-000000000034 15896 1727203871.15774: variable 'ansible_search_path' from source: unknown 15896 1727203871.15780: variable 'ansible_search_path' from source: unknown 15896 1727203871.15783: calling self._execute() 15896 1727203871.15852: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.15867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.15893: variable 'omit' from source: magic vars 15896 1727203871.16291: variable 'ansible_distribution_major_version' from source: facts 15896 1727203871.16419: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203871.16442: variable 'network_provider' from source: set_fact 15896 1727203871.16452: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203871.16462: when evaluation is False, skipping this task 15896 1727203871.16469: _execute() done 15896 1727203871.16478: dumping result to json 15896 1727203871.16486: done dumping result, returning 15896 1727203871.16497: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-fb83-b6ad-000000000034] 15896 1727203871.16507: sending task result for task 028d2410-947f-fb83-b6ad-000000000034 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203871.16682: no more pending results, returning what we have 15896 1727203871.16686: results queue empty 15896 1727203871.16686: checking for any_errors_fatal 15896 1727203871.16695: done checking for any_errors_fatal 15896 1727203871.16696: checking for max_fail_percentage 15896 1727203871.16698: done checking for max_fail_percentage 15896 
1727203871.16699: checking to see if all hosts have failed and the running result is not ok 15896 1727203871.16700: done checking to see if all hosts have failed 15896 1727203871.16700: getting the remaining hosts for this loop 15896 1727203871.16702: done getting the remaining hosts for this loop 15896 1727203871.16706: getting the next task for host managed-node1 15896 1727203871.16712: done getting next task for host managed-node1 15896 1727203871.16716: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203871.16719: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203871.16735: getting variables 15896 1727203871.16848: in VariableManager get_vars() 15896 1727203871.16909: Calling all_inventory to load vars for managed-node1 15896 1727203871.16912: Calling groups_inventory to load vars for managed-node1 15896 1727203871.16914: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203871.16926: Calling all_plugins_play to load vars for managed-node1 15896 1727203871.16930: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203871.16933: Calling groups_plugins_play to load vars for managed-node1 15896 1727203871.17527: done sending task result for task 028d2410-947f-fb83-b6ad-000000000034 15896 1727203871.17531: WORKER PROCESS EXITING 15896 1727203871.18546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203871.20142: done with get_vars() 15896 1727203871.20161: done getting variables 15896 1727203871.20207: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:11 -0400 (0:00:00.055) 0:00:16.791 ***** 15896 1727203871.20232: entering _queue_task() for managed-node1/copy 15896 1727203871.20489: worker is 1 (out of 1 available) 15896 1727203871.20503: exiting _queue_task() for managed-node1/copy 15896 1727203871.20515: done queuing things up, now waiting for results queue to drain 15896 1727203871.20517: waiting for pending results... 
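Two results in this stretch print as `"censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result"`. That placeholder is Ansible's documented behavior for tasks carrying `no_log: true`: the result is masked even at `-vvvv`. A hedged illustration (the service task body is an assumption; only the `no_log` mechanism is confirmed by the log text):

```yaml
# Hypothetical illustration of why results render as "censored":
# no_log: true suppresses the task's result in all output modes,
# including verbose callbacks and registered-variable dumps.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
```

Note that `no_log` masks both successful and skipped results, which is why even the skipped "Enable network service" task above shows the censored placeholder instead of its `false_condition`.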
15896 1727203871.20694: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203871.20790: in run() - task 028d2410-947f-fb83-b6ad-000000000035 15896 1727203871.20801: variable 'ansible_search_path' from source: unknown 15896 1727203871.20805: variable 'ansible_search_path' from source: unknown 15896 1727203871.20833: calling self._execute() 15896 1727203871.20909: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.20913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.20921: variable 'omit' from source: magic vars 15896 1727203871.21197: variable 'ansible_distribution_major_version' from source: facts 15896 1727203871.21206: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203871.21287: variable 'network_provider' from source: set_fact 15896 1727203871.21291: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203871.21294: when evaluation is False, skipping this task 15896 1727203871.21296: _execute() done 15896 1727203871.21299: dumping result to json 15896 1727203871.21301: done dumping result, returning 15896 1727203871.21313: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-fb83-b6ad-000000000035] 15896 1727203871.21315: sending task result for task 028d2410-947f-fb83-b6ad-000000000035 15896 1727203871.21407: done sending task result for task 028d2410-947f-fb83-b6ad-000000000035 15896 1727203871.21412: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203871.21463: no more pending results, returning what we have 15896 1727203871.21466: results queue empty 15896 1727203871.21467: checking for 
any_errors_fatal 15896 1727203871.21473: done checking for any_errors_fatal 15896 1727203871.21473: checking for max_fail_percentage 15896 1727203871.21476: done checking for max_fail_percentage 15896 1727203871.21477: checking to see if all hosts have failed and the running result is not ok 15896 1727203871.21478: done checking to see if all hosts have failed 15896 1727203871.21479: getting the remaining hosts for this loop 15896 1727203871.21480: done getting the remaining hosts for this loop 15896 1727203871.21483: getting the next task for host managed-node1 15896 1727203871.21489: done getting next task for host managed-node1 15896 1727203871.21493: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203871.21496: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203871.21510: getting variables 15896 1727203871.21512: in VariableManager get_vars() 15896 1727203871.21560: Calling all_inventory to load vars for managed-node1 15896 1727203871.21563: Calling groups_inventory to load vars for managed-node1 15896 1727203871.21565: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203871.21573: Calling all_plugins_play to load vars for managed-node1 15896 1727203871.21582: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203871.21585: Calling groups_plugins_play to load vars for managed-node1 15896 1727203871.22924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203871.23797: done with get_vars() 15896 1727203871.23817: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:51:11 -0400 (0:00:00.036) 0:00:16.828 ***** 15896 1727203871.23888: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203871.23889: Creating lock for fedora.linux_system_roles.network_connections 15896 1727203871.24198: worker is 1 (out of 1 available) 15896 1727203871.24211: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203871.24224: done queuing things up, now waiting for results queue to drain 15896 1727203871.24226: waiting for pending results... 
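Before executing `network_connections`, the variable manager repeatedly resolves `controller_profile`, `controller_device`, `port1_profile`/`dhcp_interface1`, and `port2_profile`/`dhcp_interface2` from play vars, which suggests a controller profile with two attached port profiles. A hedged sketch of what such play vars and the resulting `network_connections` structure could look like; every concrete value below is an assumption, since only the variable names appear in the log:

```yaml
# Hypothetical shape of the play vars the log resolves repeatedly.
controller_profile: bond0
controller_device: nm-bond
port1_profile: bond0.0
dhcp_interface1: test1
port2_profile: bond0.1
dhcp_interface2: test2

network_connections:
  - name: "{{ controller_profile }}"
    interface_name: "{{ controller_device }}"
    type: bond                       # assumed controller type
  - name: "{{ port1_profile }}"
    interface_name: "{{ dhcp_interface1 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"
    interface_name: "{{ dhcp_interface2 }}"
    type: ethernet
    controller: "{{ controller_profile }}"
```

Each `{{ … }}` reference accounts for one "variable … from source: play vars" pair in the log: the outer template resolves the profile name, then the inner value.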
15896 1727203871.24374: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203871.24448: in run() - task 028d2410-947f-fb83-b6ad-000000000036 15896 1727203871.24465: variable 'ansible_search_path' from source: unknown 15896 1727203871.24469: variable 'ansible_search_path' from source: unknown 15896 1727203871.24497: calling self._execute() 15896 1727203871.24573: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.24579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.24583: variable 'omit' from source: magic vars 15896 1727203871.25081: variable 'ansible_distribution_major_version' from source: facts 15896 1727203871.25084: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203871.25086: variable 'omit' from source: magic vars 15896 1727203871.25088: variable 'omit' from source: magic vars 15896 1727203871.25132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203871.26996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203871.27046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203871.27078: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203871.27103: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203871.27123: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203871.27186: variable 'network_provider' from source: set_fact 15896 1727203871.27287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203871.27318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203871.27334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203871.27368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203871.27372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203871.27427: variable 'omit' from source: magic vars 15896 1727203871.27510: variable 'omit' from source: magic vars 15896 1727203871.27580: variable 'network_connections' from source: task vars 15896 1727203871.27591: variable 'controller_profile' from source: play vars 15896 1727203871.27634: variable 'controller_profile' from source: play vars 15896 1727203871.27640: variable 'controller_device' from source: play vars 15896 1727203871.27682: variable 'controller_device' from source: play vars 15896 1727203871.27691: variable 'port1_profile' from source: play vars 15896 1727203871.27734: variable 'port1_profile' from source: play vars 15896 1727203871.27739: variable 'dhcp_interface1' from source: play vars 15896 1727203871.27783: variable 'dhcp_interface1' from source: play vars 15896 1727203871.27789: variable 'controller_profile' from source: play vars 15896 1727203871.27833: variable 'controller_profile' from source: play vars 15896 1727203871.27839: 
variable 'port2_profile' from source: play vars 15896 1727203871.27882: variable 'port2_profile' from source: play vars 15896 1727203871.27888: variable 'dhcp_interface2' from source: play vars 15896 1727203871.27933: variable 'dhcp_interface2' from source: play vars 15896 1727203871.27939: variable 'controller_profile' from source: play vars 15896 1727203871.27982: variable 'controller_profile' from source: play vars 15896 1727203871.28137: variable 'omit' from source: magic vars 15896 1727203871.28140: variable '__lsr_ansible_managed' from source: task vars 15896 1727203871.28202: variable '__lsr_ansible_managed' from source: task vars 15896 1727203871.28362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15896 1727203871.28863: Loaded config def from plugin (lookup/template) 15896 1727203871.29081: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15896 1727203871.29084: File lookup term: get_ansible_managed.j2 15896 1727203871.29086: variable 'ansible_search_path' from source: unknown 15896 1727203871.29089: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15896 1727203871.29093: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15896 1727203871.29095: variable 'ansible_search_path' from source: unknown 15896 1727203871.35852: variable 'ansible_managed' from source: unknown 15896 1727203871.36040: variable 'omit' from source: magic vars 15896 1727203871.36081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203871.36112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203871.36178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203871.36231: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203871.36279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203871.36329: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203871.36339: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.36349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.36491: Set connection var ansible_shell_type to sh 15896 1727203871.36504: Set connection var ansible_connection to ssh 15896 1727203871.36514: Set connection var ansible_shell_executable to /bin/sh 15896 1727203871.36523: Set connection var ansible_pipelining to False 15896 1727203871.36532: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203871.36551: Set connection var ansible_timeout to 10 15896 1727203871.36614: 
variable 'ansible_shell_executable' from source: unknown 15896 1727203871.36705: variable 'ansible_connection' from source: unknown 15896 1727203871.36722: variable 'ansible_module_compression' from source: unknown 15896 1727203871.36730: variable 'ansible_shell_type' from source: unknown 15896 1727203871.36737: variable 'ansible_shell_executable' from source: unknown 15896 1727203871.36744: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203871.36881: variable 'ansible_pipelining' from source: unknown 15896 1727203871.36884: variable 'ansible_timeout' from source: unknown 15896 1727203871.36886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203871.37066: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203871.37137: variable 'omit' from source: magic vars 15896 1727203871.37149: starting attempt loop 15896 1727203871.37155: running the handler 15896 1727203871.37173: _low_level_execute_command(): starting 15896 1727203871.37190: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203871.38591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203871.38887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203871.38986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203871.40764: stdout chunk (state=3): >>>/root <<< 15896 1727203871.40958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203871.40962: stdout chunk (state=3): >>><<< 15896 1727203871.40964: stderr chunk (state=3): >>><<< 15896 1727203871.41182: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203871.41186: _low_level_execute_command(): starting 15896 1727203871.41190: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281 `" && echo ansible-tmp-1727203871.4111767-17522-134497042328281="` echo /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281 `" ) && sleep 0' 15896 1727203871.41962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203871.41978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203871.42251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203871.42256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203871.42258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203871.42299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203871.42314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 15896 1727203871.42514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203871.44693: stdout chunk (state=3): >>>ansible-tmp-1727203871.4111767-17522-134497042328281=/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281 <<< 15896 1727203871.44807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203871.44826: stderr chunk (state=3): >>><<< 15896 1727203871.44829: stdout chunk (state=3): >>><<< 15896 1727203871.44870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203871.4111767-17522-134497042328281=/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203871.44891: variable 'ansible_module_compression' from source: 
unknown 15896 1727203871.44940: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15896 1727203871.44948: ANSIBALLZ: Acquiring lock 15896 1727203871.44951: ANSIBALLZ: Lock acquired: 140082272959520 15896 1727203871.44953: ANSIBALLZ: Creating module 15896 1727203871.65379: ANSIBALLZ: Writing module into payload 15896 1727203871.65602: ANSIBALLZ: Writing module 15896 1727203871.65624: ANSIBALLZ: Renaming module 15896 1727203871.65628: ANSIBALLZ: Done creating module 15896 1727203871.65652: variable 'ansible_facts' from source: unknown 15896 1727203871.65720: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py 15896 1727203871.65823: Sending initial data 15896 1727203871.65826: Sent initial data (168 bytes) 15896 1727203871.66281: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203871.66316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203871.66319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203871.66322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203871.66324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203871.66326: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203871.66379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203871.66382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203871.66387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203871.66470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203871.68208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203871.68287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203871.68371: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpsrffzayt /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py <<< 15896 1727203871.68375: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py" <<< 15896 1727203871.68443: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpsrffzayt" to remote "/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py" <<< 15896 1727203871.68447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py" <<< 15896 1727203871.69301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203871.69349: stderr chunk (state=3): >>><<< 15896 1727203871.69352: stdout chunk (state=3): >>><<< 15896 1727203871.69392: done transferring module to remote 15896 1727203871.69401: _low_level_execute_command(): starting 15896 1727203871.69406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/ /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py && sleep 0' 15896 1727203871.69845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203871.69849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203871.69887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found <<< 15896 1727203871.69890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203871.69893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203871.69895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203871.69950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203871.69953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203871.69959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203871.70036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203871.71976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203871.72003: stderr chunk (state=3): >>><<< 15896 1727203871.72006: stdout chunk (state=3): >>><<< 15896 1727203871.72020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203871.72023: _low_level_execute_command(): starting 15896 1727203871.72033: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/AnsiballZ_network_connections.py && sleep 0' 15896 1727203871.72489: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203871.72505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203871.72561: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203871.72564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203871.72569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203871.72655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203872.16799: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", 
"miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15896 1727203872.19597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203872.19705: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 15896 1727203872.19709: stdout chunk (state=3): >>><<< 15896 1727203872.19711: stderr chunk (state=3): >>><<< 15896 1727203872.19714: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": 
"bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203872.19747: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203872.19825: _low_level_execute_command(): starting 15896 1727203872.19836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203871.4111767-17522-134497042328281/ > /dev/null 2>&1 && sleep 0' 15896 1727203872.21194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203872.21249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203872.21396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203872.21510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203872.21605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203872.23749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203872.23765: stdout chunk (state=3): >>><<< 15896 1727203872.23786: stderr chunk (state=3): >>><<< 15896 1727203872.23990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203872.23994: handler run complete 15896 1727203872.23996: attempt loop complete, returning result 15896 1727203872.23998: _execute() done 15896 1727203872.24000: dumping result to json 15896 1727203872.24002: done dumping result, returning 15896 1727203872.24003: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-fb83-b6ad-000000000036] 15896 1727203872.24005: sending task result for task 028d2410-947f-fb83-b6ad-000000000036 15896 1727203872.24560: done sending task result for task 028d2410-947f-fb83-b6ad-000000000036 15896 1727203872.24564: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "bond": {
                        "miimon": 110,
                        "mode": "active-backup"
                    },
                    "interface_name": "nm-bond",
                    "ip": {
                        "route_metric4": 65535
                    },
                    "name": "bond0",
                    "state": "up",
                    "type": "bond"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test1",
                    "name": "bond0.0",
                    "state": "up",
                    "type": "ethernet"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test2",
                    "name": "bond0.1",
                    "state": "up",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d
[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c
[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820
[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)
[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)
[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)

15896 1727203872.24707: no more pending results, returning what we have 15896 1727203872.24710: results queue empty 15896 1727203872.24711: checking for any_errors_fatal 15896 1727203872.24719: done checking for any_errors_fatal 15896 1727203872.24720: checking for max_fail_percentage 15896 1727203872.24722: done checking for max_fail_percentage 15896 1727203872.24723: checking to see if all hosts have failed and the running result is not ok 15896 1727203872.24724: done checking to see if all hosts have failed 15896 1727203872.24724: getting the remaining hosts for this loop 15896 1727203872.24726: done getting the remaining hosts for this loop 15896 1727203872.24730: getting the next task for host managed-node1 15896 1727203872.24737: done getting next task for host managed-node1 15896 1727203872.24741: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203872.24744: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15896 1727203872.24756: getting variables 15896 1727203872.24761: in VariableManager get_vars() 15896 1727203872.25288: Calling all_inventory to load vars for managed-node1 15896 1727203872.25291: Calling groups_inventory to load vars for managed-node1 15896 1727203872.25294: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203872.25303: Calling all_plugins_play to load vars for managed-node1 15896 1727203872.25306: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203872.25309: Calling groups_plugins_play to load vars for managed-node1 15896 1727203872.26400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203872.27262: done with get_vars() 15896 1727203872.27352: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Tuesday 24 September 2024  14:51:12 -0400 (0:00:01.035)       0:00:17.864 *****

15896 1727203872.27469: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203872.27471: Creating lock for fedora.linux_system_roles.network_state 15896 1727203872.28093: worker is 1 (out of 1 available) 15896 1727203872.28107: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203872.28142: done queuing things up, now waiting for results queue to drain 15896 1727203872.28144: waiting for pending results...
15896 1727203872.28996: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state
15896 1727203872.29001: in run() - task 028d2410-947f-fb83-b6ad-000000000037
15896 1727203872.29004: variable 'ansible_search_path' from source: unknown
15896 1727203872.29020: variable 'ansible_search_path' from source: unknown
15896 1727203872.29067: calling self._execute()
15896 1727203872.29167: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.29278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.29295: variable 'omit' from source: magic vars
15896 1727203872.29713: variable 'ansible_distribution_major_version' from source: facts
15896 1727203872.29729: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203872.29861: variable 'network_state' from source: role '' defaults
15896 1727203872.29879: Evaluated conditional (network_state != {}): False
15896 1727203872.29887: when evaluation is False, skipping this task
15896 1727203872.29893: _execute() done
15896 1727203872.29900: dumping result to json
15896 1727203872.29906: done dumping result, returning
15896 1727203872.29917: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-fb83-b6ad-000000000037]
15896 1727203872.29926: sending task result for task 028d2410-947f-fb83-b6ad-000000000037
15896 1727203872.30029: done sending task result for task 028d2410-947f-fb83-b6ad-000000000037
15896 1727203872.30037: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203872.30126: no more pending results, returning what we have
15896 1727203872.30130: results queue empty
15896 1727203872.30130: checking for any_errors_fatal
15896 1727203872.30363: done checking for any_errors_fatal
15896 1727203872.30365: checking for max_fail_percentage
15896 1727203872.30367: done checking for max_fail_percentage
15896 1727203872.30368: checking to see if all hosts have failed and the running result is not ok
15896 1727203872.30369: done checking to see if all hosts have failed
15896 1727203872.30369: getting the remaining hosts for this loop
15896 1727203872.30371: done getting the remaining hosts for this loop
15896 1727203872.30374: getting the next task for host managed-node1
15896 1727203872.30382: done getting next task for host managed-node1
15896 1727203872.30386: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
15896 1727203872.30389: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203872.30403: getting variables
15896 1727203872.30405: in VariableManager get_vars()
15896 1727203872.30450: Calling all_inventory to load vars for managed-node1
15896 1727203872.30453: Calling groups_inventory to load vars for managed-node1
15896 1727203872.30455: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203872.30466: Calling all_plugins_play to load vars for managed-node1
15896 1727203872.30469: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203872.30472: Calling groups_plugins_play to load vars for managed-node1
15896 1727203872.31912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203872.33525: done with get_vars()
15896 1727203872.33555: done getting variables
15896 1727203872.33619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Tuesday 24 September 2024  14:51:12 -0400 (0:00:00.061)       0:00:17.925 *****

15896 1727203872.33652: entering _queue_task() for managed-node1/debug
15896 1727203872.34032: worker is 1 (out of 1 available)
15896 1727203872.34044: exiting _queue_task() for managed-node1/debug
15896 1727203872.34055: done queuing things up, now waiting for results queue to drain
15896 1727203872.34057: waiting for pending results...
15896 1727203872.34357: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
15896 1727203872.34681: in run() - task 028d2410-947f-fb83-b6ad-000000000038
15896 1727203872.34685: variable 'ansible_search_path' from source: unknown
15896 1727203872.34688: variable 'ansible_search_path' from source: unknown
15896 1727203872.34691: calling self._execute()
15896 1727203872.34693: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.34696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.34698: variable 'omit' from source: magic vars
15896 1727203872.35099: variable 'ansible_distribution_major_version' from source: facts
15896 1727203872.35116: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203872.35127: variable 'omit' from source: magic vars
15896 1727203872.35193: variable 'omit' from source: magic vars
15896 1727203872.35233: variable 'omit' from source: magic vars
15896 1727203872.35288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15896 1727203872.35326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15896 1727203872.35353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15896 1727203872.35468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.35472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.35474: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203872.35478: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.35480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.35552: Set connection var ansible_shell_type to sh
15896 1727203872.35569: Set connection var ansible_connection to ssh
15896 1727203872.35587: Set connection var ansible_shell_executable to /bin/sh
15896 1727203872.35597: Set connection var ansible_pipelining to False
15896 1727203872.35606: Set connection var ansible_module_compression to ZIP_DEFLATED
15896 1727203872.35615: Set connection var ansible_timeout to 10
15896 1727203872.35641: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.35649: variable 'ansible_connection' from source: unknown
15896 1727203872.35656: variable 'ansible_module_compression' from source: unknown
15896 1727203872.35667: variable 'ansible_shell_type' from source: unknown
15896 1727203872.35674: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.35684: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.35696: variable 'ansible_pipelining' from source: unknown
15896 1727203872.35702: variable 'ansible_timeout' from source: unknown
15896 1727203872.35708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.35854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15896 1727203872.35907: variable 'omit' from source: magic vars
15896 1727203872.35910: starting attempt loop
15896 1727203872.35912: running the handler
15896 1727203872.36027: variable '__network_connections_result' from source: set_fact
15896 1727203872.36096: handler run complete
15896 1727203872.36123: attempt loop complete, returning result
15896 1727203872.36234: _execute() done
15896 1727203872.36237: dumping result to json
15896 1727203872.36240: done dumping result, returning
15896 1727203872.36242: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-fb83-b6ad-000000000038]
15896 1727203872.36244: sending task result for task 028d2410-947f-fb83-b6ad-000000000038
15896 1727203872.36316: done sending task result for task 028d2410-947f-fb83-b6ad-000000000038
15896 1727203872.36319: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d",
        "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c",
        "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)"
    ]
}
15896 1727203872.36408: no more pending results, returning what we have
15896 1727203872.36411: results queue empty
15896 1727203872.36412: checking for any_errors_fatal
15896 1727203872.36418: done checking for any_errors_fatal
15896 1727203872.36419: checking for max_fail_percentage
15896 1727203872.36421: done checking for max_fail_percentage
15896 1727203872.36422: checking to see if all hosts have failed and the running result is not ok
15896 1727203872.36422: done checking to see if all hosts have failed
15896 1727203872.36423: getting the remaining hosts for this loop
15896 1727203872.36425: done getting the remaining hosts for this loop
15896 1727203872.36428: getting the next task for host managed-node1
15896 1727203872.36435: done getting next task for host managed-node1
15896 1727203872.36439: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
15896 1727203872.36442: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203872.36453: getting variables
15896 1727203872.36455: in VariableManager get_vars()
15896 1727203872.36513: Calling all_inventory to load vars for managed-node1
15896 1727203872.36516: Calling groups_inventory to load vars for managed-node1
15896 1727203872.36519: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203872.36529: Calling all_plugins_play to load vars for managed-node1
15896 1727203872.36532: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203872.36535: Calling groups_plugins_play to load vars for managed-node1
15896 1727203872.39905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203872.43166: done with get_vars()
15896 1727203872.43204: done getting variables
15896 1727203872.43273: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024  14:51:12 -0400 (0:00:00.098)       0:00:18.024 *****

15896 1727203872.43516: entering _queue_task() for managed-node1/debug
15896 1727203872.44117: worker is 1 (out of 1 available)
15896 1727203872.44129: exiting _queue_task() for managed-node1/debug
15896 1727203872.44141: done queuing things up, now waiting for results queue to drain
15896 1727203872.44142: waiting for pending results...
15896 1727203872.44519: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
15896 1727203872.44936: in run() - task 028d2410-947f-fb83-b6ad-000000000039
15896 1727203872.44941: variable 'ansible_search_path' from source: unknown
15896 1727203872.44944: variable 'ansible_search_path' from source: unknown
15896 1727203872.44947: calling self._execute()
15896 1727203872.45114: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.45371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.45377: variable 'omit' from source: magic vars
15896 1727203872.46086: variable 'ansible_distribution_major_version' from source: facts
15896 1727203872.46113: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203872.46131: variable 'omit' from source: magic vars
15896 1727203872.46226: variable 'omit' from source: magic vars
15896 1727203872.46327: variable 'omit' from source: magic vars
15896 1727203872.46538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15896 1727203872.46542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15896 1727203872.46586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15896 1727203872.46669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.46691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.46728: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203872.46762: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.46788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.47018: Set connection var ansible_shell_type to sh
15896 1727203872.47023: Set connection var ansible_connection to ssh
15896 1727203872.47033: Set connection var ansible_shell_executable to /bin/sh
15896 1727203872.47039: Set connection var ansible_pipelining to False
15896 1727203872.47045: Set connection var ansible_module_compression to ZIP_DEFLATED
15896 1727203872.47117: Set connection var ansible_timeout to 10
15896 1727203872.47280: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.47283: variable 'ansible_connection' from source: unknown
15896 1727203872.47286: variable 'ansible_module_compression' from source: unknown
15896 1727203872.47288: variable 'ansible_shell_type' from source: unknown
15896 1727203872.47290: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.47294: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.47296: variable 'ansible_pipelining' from source: unknown
15896 1727203872.47299: variable 'ansible_timeout' from source: unknown
15896 1727203872.47301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.47417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15896 1727203872.47421: variable 'omit' from source: magic vars
15896 1727203872.47423: starting attempt loop
15896 1727203872.47425: running the handler
15896 1727203872.47427: variable '__network_connections_result' from source: set_fact
15896 1727203872.47488: variable '__network_connections_result' from source: set_fact
15896 1727203872.47780: handler run complete
15896 1727203872.47783: attempt loop complete, returning result
15896 1727203872.47785: _execute() done
15896 1727203872.47787: dumping result to json
15896 1727203872.47789: done dumping result, returning
15896 1727203872.47791: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-fb83-b6ad-000000000039]
15896 1727203872.47793: sending task result for task 028d2410-947f-fb83-b6ad-000000000039
15896 1727203872.47961: done sending task result for task 028d2410-947f-fb83-b6ad-000000000039
15896 1727203872.47964: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, cf994329-c7c7-4568-8772-d142c724631d",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, cf994329-c7c7-4568-8772-d142c724631d (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)"
        ]
    }
}
15896 1727203872.48054: no more pending results, returning what we have
15896 1727203872.48056: results queue empty
15896 1727203872.48139: checking for any_errors_fatal
15896 1727203872.48146: done checking for any_errors_fatal
15896 1727203872.48147: checking for max_fail_percentage
15896 1727203872.48149: done checking for max_fail_percentage
15896 1727203872.48150: checking to see if all hosts have failed and the running result is not ok
15896 1727203872.48150: done checking to see if all hosts have failed
15896 1727203872.48151: getting the remaining hosts for this loop
15896 1727203872.48152: done getting the remaining hosts for this loop
15896 1727203872.48156: getting the next task for host managed-node1
15896 1727203872.48162: done getting next task for host managed-node1
15896 1727203872.48165: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
15896 1727203872.48168: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203872.48186: getting variables
15896 1727203872.48187: in VariableManager get_vars()
15896 1727203872.48231: Calling all_inventory to load vars for managed-node1
15896 1727203872.48233: Calling groups_inventory to load vars for managed-node1
15896 1727203872.48236: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203872.48244: Calling all_plugins_play to load vars for managed-node1
15896 1727203872.48247: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203872.48250: Calling groups_plugins_play to load vars for managed-node1
15896 1727203872.51005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203872.54337: done with get_vars()
15896 1727203872.54457: done getting variables
15896 1727203872.54524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024  14:51:12 -0400 (0:00:00.110)       0:00:18.134 *****

15896 1727203872.54557: entering _queue_task() for managed-node1/debug
15896 1727203872.55181: worker is 1 (out of 1 available)
15896 1727203872.55194: exiting _queue_task() for managed-node1/debug
15896 1727203872.55208: done queuing things up, now waiting for results queue to drain
15896 1727203872.55210: waiting for pending results...
15896 1727203872.55894: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
15896 1727203872.56037: in run() - task 028d2410-947f-fb83-b6ad-00000000003a
15896 1727203872.56062: variable 'ansible_search_path' from source: unknown
15896 1727203872.56073: variable 'ansible_search_path' from source: unknown
15896 1727203872.56120: calling self._execute()
15896 1727203872.56381: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.56385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.56388: variable 'omit' from source: magic vars
15896 1727203872.56630: variable 'ansible_distribution_major_version' from source: facts
15896 1727203872.56646: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203872.56773: variable 'network_state' from source: role '' defaults
15896 1727203872.56792: Evaluated conditional (network_state != {}): False
15896 1727203872.56799: when evaluation is False, skipping this task
15896 1727203872.56805: _execute() done
15896 1727203872.56812: dumping result to json
15896 1727203872.56819: done dumping result, returning
15896 1727203872.56830: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-fb83-b6ad-00000000003a]
15896 1727203872.56839: sending task result for task 028d2410-947f-fb83-b6ad-00000000003a
15896 1727203872.56946: done sending task result for task 028d2410-947f-fb83-b6ad-00000000003a
15896 1727203872.56953: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
15896 1727203872.57012: no more pending results, returning what we have
15896 1727203872.57015: results queue empty
15896 1727203872.57016: checking for any_errors_fatal
15896 1727203872.57028: done checking for any_errors_fatal
15896 1727203872.57029: checking for max_fail_percentage
15896 1727203872.57031: done checking for max_fail_percentage
15896 1727203872.57032: checking to see if all hosts have failed and the running result is not ok
15896 1727203872.57032: done checking to see if all hosts have failed
15896 1727203872.57033: getting the remaining hosts for this loop
15896 1727203872.57035: done getting the remaining hosts for this loop
15896 1727203872.57038: getting the next task for host managed-node1
15896 1727203872.57045: done getting next task for host managed-node1
15896 1727203872.57050: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
15896 1727203872.57054: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203872.57069: getting variables
15896 1727203872.57070: in VariableManager get_vars()
15896 1727203872.57122: Calling all_inventory to load vars for managed-node1
15896 1727203872.57125: Calling groups_inventory to load vars for managed-node1
15896 1727203872.57127: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203872.57138: Calling all_plugins_play to load vars for managed-node1
15896 1727203872.57141: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203872.57143: Calling groups_plugins_play to load vars for managed-node1
15896 1727203872.58959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203872.61814: done with get_vars()
15896 1727203872.61847: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024  14:51:12 -0400 (0:00:00.073)       0:00:18.208 *****

15896 1727203872.61951: entering _queue_task() for managed-node1/ping
15896 1727203872.61952: Creating lock for ping
15896 1727203872.62331: worker is 1 (out of 1 available)
15896 1727203872.62346: exiting _queue_task() for managed-node1/ping
15896 1727203872.62357: done queuing things up, now waiting for results queue to drain
15896 1727203872.62359: waiting for pending results...
15896 1727203872.62662: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity
15896 1727203872.62983: in run() - task 028d2410-947f-fb83-b6ad-00000000003b
15896 1727203872.62987: variable 'ansible_search_path' from source: unknown
15896 1727203872.62989: variable 'ansible_search_path' from source: unknown
15896 1727203872.62993: calling self._execute()
15896 1727203872.62995: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.62998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.63000: variable 'omit' from source: magic vars
15896 1727203872.63322: variable 'ansible_distribution_major_version' from source: facts
15896 1727203872.63332: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203872.63339: variable 'omit' from source: magic vars
15896 1727203872.63403: variable 'omit' from source: magic vars
15896 1727203872.63436: variable 'omit' from source: magic vars
15896 1727203872.63625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15896 1727203872.63762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15896 1727203872.63781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15896 1727203872.63981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.63985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203872.63987: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203872.63990: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.63992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.63994: Set connection var ansible_shell_type to sh
15896 1727203872.63996: Set connection var ansible_connection to ssh
15896 1727203872.63997: Set connection var ansible_shell_executable to /bin/sh
15896 1727203872.63999: Set connection var ansible_pipelining to False
15896 1727203872.64001: Set connection var ansible_module_compression to ZIP_DEFLATED
15896 1727203872.64003: Set connection var ansible_timeout to 10
15896 1727203872.64181: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.64185: variable 'ansible_connection' from source: unknown
15896 1727203872.64188: variable 'ansible_module_compression' from source: unknown
15896 1727203872.64191: variable 'ansible_shell_type' from source: unknown
15896 1727203872.64194: variable 'ansible_shell_executable' from source: unknown
15896 1727203872.64196: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203872.64199: variable 'ansible_pipelining' from source: unknown
15896 1727203872.64201: variable 'ansible_timeout' from source: unknown
15896 1727203872.64204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203872.64252: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
15896 1727203872.64263: variable 'omit' from source: magic vars
15896 1727203872.64272: starting attempt loop
15896 1727203872.64274: running the handler
15896 1727203872.64290: _low_level_execute_command(): starting
15896 1727203872.64299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15896 1727203872.65040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15896 1727203872.65118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
15896 1727203872.65184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203872.65249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203872.67043: stdout chunk (state=3): >>>/root <<<
15896 1727203872.67205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15896 1727203872.67208: stdout chunk (state=3): >>><<<
15896 1727203872.67211: stderr chunk (state=3): >>><<<
15896 1727203872.67230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203872.67250: _low_level_execute_command(): starting
15896 1727203872.67343: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796 `" && echo ansible-tmp-1727203872.672367-17728-205334707666796="` echo /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796 `" ) && sleep 0'
15896 1727203872.67930: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15896 1727203872.67944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15896 1727203872.67960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15896 1727203872.67981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15896 1727203872.67997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<<
15896 1727203872.68008: stderr chunk (state=3): >>>debug2: match not found <<<
15896 1727203872.68092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203872.68136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203872.68151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203872.68178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203872.68291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203872.70419: stdout chunk (state=3): >>>ansible-tmp-1727203872.672367-17728-205334707666796=/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796 <<< 15896 1727203872.70592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203872.70596: stdout chunk (state=3): >>><<< 15896 1727203872.70599: stderr chunk (state=3): >>><<< 15896 1727203872.70619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203872.672367-17728-205334707666796=/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203872.70785: variable 'ansible_module_compression' from source: unknown 15896 1727203872.70789: ANSIBALLZ: Using lock for ping 15896 1727203872.70791: ANSIBALLZ: Acquiring lock 15896 1727203872.70793: ANSIBALLZ: Lock acquired: 140082266721216 15896 1727203872.70795: ANSIBALLZ: Creating module 15896 1727203872.83954: ANSIBALLZ: Writing module into payload 15896 1727203872.84033: ANSIBALLZ: Writing module 15896 1727203872.84065: ANSIBALLZ: Renaming module 15896 1727203872.84081: ANSIBALLZ: Done creating module 15896 1727203872.84110: variable 'ansible_facts' from source: unknown 15896 1727203872.84187: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py 15896 1727203872.84403: Sending initial data 15896 1727203872.84406: Sent initial data (152 bytes) 15896 1727203872.85041: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203872.85157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203872.85192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203872.85310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203872.87095: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203872.87216: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203872.87298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpn6aunlwj /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py <<< 15896 1727203872.87323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py" <<< 15896 1727203872.87414: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpn6aunlwj" to remote "/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py" <<< 15896 1727203872.88434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203872.88437: stdout chunk (state=3): >>><<< 15896 1727203872.88440: stderr chunk (state=3): >>><<< 15896 1727203872.88442: done transferring module to remote 15896 1727203872.88444: _low_level_execute_command(): starting 15896 1727203872.88446: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/ /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py && sleep 0' 15896 1727203872.89096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203872.89138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203872.89154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203872.89180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203872.89312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203872.91373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203872.91610: stderr chunk (state=3): >>><<< 15896 1727203872.91614: stdout chunk (state=3): >>><<< 15896 1727203872.91617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203872.91619: _low_level_execute_command(): starting 15896 1727203872.91622: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/AnsiballZ_ping.py && sleep 0' 15896 1727203872.92605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203872.92787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203872.92996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203872.93189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.09504: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15896 
1727203873.11048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203873.11053: stdout chunk (state=3): >>><<< 15896 1727203873.11063: stderr chunk (state=3): >>><<< 15896 1727203873.11221: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
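The exchange above ends with the remote module printing `{"ping": "pong", ...}` on stdout, which is the entire contract of `ansible.builtin.ping`: it echoes its `data` argument back, proving the control node can SSH in, stage a module, and run Python on the target. A minimal sketch of that behavior (illustrative only, not the module's actual source):

```python
import json

def ping(data="pong"):
    # ansible.builtin.ping simply echoes its 'data' argument back; a result of
    # {"ping": "pong"} means the module was transferred and executed end to end.
    if data == "crash":
        # the module's documented deliberate failure mode, for testing error paths
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping()))
```

The `rc=0` plus the pong payload is why the task result a few records later is `ok: [managed-node1]` with `"changed": false`.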
15896 1727203873.11225: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203873.11227: _low_level_execute_command(): starting 15896 1727203873.11229: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203872.672367-17728-205334707666796/ > /dev/null 2>&1 && sleep 0' 15896 1727203873.11856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203873.11891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.11951: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.12004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.12049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.12063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.12178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.14197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.14278: stderr chunk (state=3): >>><<< 15896 1727203873.14281: stdout chunk (state=3): >>><<< 15896 1727203873.14481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203873.14490: handler run complete 15896 
1727203873.14492: attempt loop complete, returning result 15896 1727203873.14494: _execute() done 15896 1727203873.14496: dumping result to json 15896 1727203873.14499: done dumping result, returning 15896 1727203873.14501: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-fb83-b6ad-00000000003b] 15896 1727203873.14503: sending task result for task 028d2410-947f-fb83-b6ad-00000000003b 15896 1727203873.14573: done sending task result for task 028d2410-947f-fb83-b6ad-00000000003b 15896 1727203873.14579: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 15896 1727203873.14643: no more pending results, returning what we have 15896 1727203873.14647: results queue empty 15896 1727203873.14648: checking for any_errors_fatal 15896 1727203873.14655: done checking for any_errors_fatal 15896 1727203873.14656: checking for max_fail_percentage 15896 1727203873.14660: done checking for max_fail_percentage 15896 1727203873.14661: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.14662: done checking to see if all hosts have failed 15896 1727203873.14662: getting the remaining hosts for this loop 15896 1727203873.14664: done getting the remaining hosts for this loop 15896 1727203873.14668: getting the next task for host managed-node1 15896 1727203873.14679: done getting next task for host managed-node1 15896 1727203873.14682: ^ task is: TASK: meta (role_complete) 15896 1727203873.14685: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203873.14700: getting variables 15896 1727203873.14702: in VariableManager get_vars() 15896 1727203873.14760: Calling all_inventory to load vars for managed-node1 15896 1727203873.14763: Calling groups_inventory to load vars for managed-node1 15896 1727203873.14766: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.14890: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.14895: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.14899: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.16594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.18284: done with get_vars() 15896 1727203873.18309: done getting variables 15896 1727203873.18393: done queuing things up, now waiting for results queue to drain 15896 1727203873.18395: results queue empty 15896 1727203873.18396: checking for any_errors_fatal 15896 1727203873.18399: done checking for any_errors_fatal 15896 1727203873.18400: checking for max_fail_percentage 15896 1727203873.18401: done checking for max_fail_percentage 15896 1727203873.18402: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.18403: done checking to see if all hosts have failed 15896 1727203873.18403: getting the remaining hosts for this loop 15896 1727203873.18404: done getting the remaining hosts for this loop 15896 1727203873.18407: getting the next task for host managed-node1 15896 1727203873.18412: done getting next task for host managed-node1 15896 1727203873.18415: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15896 1727203873.18417: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203873.18419: getting variables 15896 1727203873.18420: in VariableManager get_vars() 15896 1727203873.18440: Calling all_inventory to load vars for managed-node1 15896 1727203873.18442: Calling groups_inventory to load vars for managed-node1 15896 1727203873.18444: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.18449: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.18452: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.18454: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.19523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.21020: done with get_vars() 15896 1727203873.21049: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.591) 0:00:18.800 ***** 15896 1727203873.21130: entering _queue_task() for managed-node1/include_tasks 15896 1727203873.21485: worker is 1 (out of 1 available) 15896 1727203873.21497: exiting _queue_task() for managed-node1/include_tasks 15896 1727203873.21510: done queuing things up, now waiting for results queue to drain 15896 1727203873.21511: waiting for pending results... 
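Every record in this trace follows the shape `<worker pid> <unix timestamp>: <message>`, which makes the output easy to post-process (for example, to compute per-step latencies). A small parser for one such record, assuming only that line format:

```python
import re
from datetime import datetime, timezone

# Each -vvvv record looks like "15896 1727203873.21884: running TaskExecutor() ...".
RECORD = re.compile(r"^(\d+) (\d+\.\d+): (.*)$")

def parse_record(line):
    """Split one debug record into pid, wall-clock time, and message."""
    m = RECORD.match(line)
    if not m:
        return None  # e.g. a wrapped continuation line or an SSH stderr chunk
    pid, ts, msg = m.groups()
    return {
        "pid": int(pid),
        "time": datetime.fromtimestamp(float(ts), tz=timezone.utc),
        "message": msg,
    }

rec = parse_record("15896 1727203873.21884: running TaskExecutor() for managed-node1")
print(rec["pid"], rec["time"].date(), rec["message"])
```

Subtracting consecutive timestamps recovers the same durations the task banner lines report (e.g. the `(0:00:00.591)` on the task header above).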
15896 1727203873.21884: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 15896 1727203873.22002: in run() - task 028d2410-947f-fb83-b6ad-00000000006e 15896 1727203873.22006: variable 'ansible_search_path' from source: unknown 15896 1727203873.22008: variable 'ansible_search_path' from source: unknown 15896 1727203873.22011: calling self._execute() 15896 1727203873.22069: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.22085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.22104: variable 'omit' from source: magic vars 15896 1727203873.22494: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.22511: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.22521: _execute() done 15896 1727203873.22529: dumping result to json 15896 1727203873.22537: done dumping result, returning 15896 1727203873.22551: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-fb83-b6ad-00000000006e] 15896 1727203873.22560: sending task result for task 028d2410-947f-fb83-b6ad-00000000006e 15896 1727203873.22696: no more pending results, returning what we have 15896 1727203873.22701: in VariableManager get_vars() 15896 1727203873.22762: Calling all_inventory to load vars for managed-node1 15896 1727203873.22765: Calling groups_inventory to load vars for managed-node1 15896 1727203873.22767: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.22784: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.22787: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.22790: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.23689: done sending task result for task 028d2410-947f-fb83-b6ad-00000000006e 15896 1727203873.23693: WORKER PROCESS EXITING 15896 
1727203873.24506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.25973: done with get_vars() 15896 1727203873.26002: variable 'ansible_search_path' from source: unknown 15896 1727203873.26003: variable 'ansible_search_path' from source: unknown 15896 1727203873.26044: we have included files to process 15896 1727203873.26046: generating all_blocks data 15896 1727203873.26048: done generating all_blocks data 15896 1727203873.26053: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203873.26055: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203873.26057: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15896 1727203873.26253: done processing included file 15896 1727203873.26255: iterating over new_blocks loaded from include file 15896 1727203873.26257: in VariableManager get_vars() 15896 1727203873.26287: done with get_vars() 15896 1727203873.26289: filtering new block on tags 15896 1727203873.26307: done filtering new block on tags 15896 1727203873.26309: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 15896 1727203873.26315: extending task lists for all hosts with included blocks 15896 1727203873.26414: done extending task lists 15896 1727203873.26415: done processing included files 15896 1727203873.26416: results queue empty 15896 1727203873.26417: checking for any_errors_fatal 15896 1727203873.26419: done checking for any_errors_fatal 15896 1727203873.26419: checking for max_fail_percentage 15896 1727203873.26421: done checking for 
max_fail_percentage 15896 1727203873.26421: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.26422: done checking to see if all hosts have failed 15896 1727203873.26423: getting the remaining hosts for this loop 15896 1727203873.26425: done getting the remaining hosts for this loop 15896 1727203873.26427: getting the next task for host managed-node1 15896 1727203873.26431: done getting next task for host managed-node1 15896 1727203873.26434: ^ task is: TASK: Get stat for interface {{ interface }} 15896 1727203873.26437: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203873.26440: getting variables 15896 1727203873.26441: in VariableManager get_vars() 15896 1727203873.26460: Calling all_inventory to load vars for managed-node1 15896 1727203873.26463: Calling groups_inventory to load vars for managed-node1 15896 1727203873.26465: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.26470: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.26472: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.26477: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.27565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.29086: done with get_vars() 15896 1727203873.29111: done getting variables 15896 1727203873.29270: variable 'interface' from source: task vars 15896 1727203873.29275: variable 'controller_device' from source: play vars 15896 1727203873.29335: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.082) 0:00:18.883 ***** 15896 1727203873.29367: entering _queue_task() for managed-node1/stat 15896 1727203873.29812: worker is 1 (out of 1 available) 15896 1727203873.29822: exiting _queue_task() for managed-node1/stat 15896 1727203873.29832: done queuing things up, now waiting for results queue to drain 15896 1727203873.29834: waiting for pending results... 
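The `variable ... from source` lines above record Ansible's precedence walk while templating the task banner: `interface` comes from task vars, and resolving it pulls in `controller_device` from play vars, yielding the rendered name `Get stat for interface nm-bond`. A toy sketch of that two-step resolution — a simplified stand-in, not Ansible's actual `Templar` (which uses full Jinja2), and the specific var values are assumptions inferred from the log:

```python
import re

# Hypothetical stand-ins for the variable layers seen in the log;
# in Ansible's precedence order, task vars are consulted before play vars.
play_vars = {"controller_device": "nm-bond"}
task_vars = {"interface": "{{ controller_device }}"}


def resolve(template, layers):
    """Repeatedly substitute {{ var }} references until none remain.

    Toy resolver only -- real Ansible templates via Jinja2 (Templar).
    """
    pattern = re.compile(r"\{\{\s*(\w+)\s*\}\}")

    def lookup(name):
        for layer in layers:  # earlier layers win (task vars first)
            if name in layer:
                return layer[name]
        raise KeyError(name)

    while True:
        match = pattern.search(template)
        if match is None:
            return template
        # Recurse so a var whose value is itself a template gets resolved
        value = resolve(str(lookup(match.group(1))), layers)
        template = template[:match.start()] + value + template[match.end():]


banner = resolve("Get stat for interface {{ interface }}",
                 [task_vars, play_vars])
```

Under these assumed values, `banner` comes out as the rendered task name shown in the log's `TASK [...]` header.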
15896 1727203873.30073: running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond 15896 1727203873.30160: in run() - task 028d2410-947f-fb83-b6ad-000000000337 15896 1727203873.30190: variable 'ansible_search_path' from source: unknown 15896 1727203873.30380: variable 'ansible_search_path' from source: unknown 15896 1727203873.30384: calling self._execute() 15896 1727203873.30387: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.30389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.30391: variable 'omit' from source: magic vars 15896 1727203873.30726: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.30744: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.30756: variable 'omit' from source: magic vars 15896 1727203873.30818: variable 'omit' from source: magic vars 15896 1727203873.30924: variable 'interface' from source: task vars 15896 1727203873.30937: variable 'controller_device' from source: play vars 15896 1727203873.31003: variable 'controller_device' from source: play vars 15896 1727203873.31029: variable 'omit' from source: magic vars 15896 1727203873.31161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203873.31165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203873.31168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203873.31170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.31189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.31223: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203873.31232: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.31241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.31348: Set connection var ansible_shell_type to sh 15896 1727203873.31362: Set connection var ansible_connection to ssh 15896 1727203873.31380: Set connection var ansible_shell_executable to /bin/sh 15896 1727203873.31391: Set connection var ansible_pipelining to False 15896 1727203873.31402: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203873.31412: Set connection var ansible_timeout to 10 15896 1727203873.31438: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.31447: variable 'ansible_connection' from source: unknown 15896 1727203873.31457: variable 'ansible_module_compression' from source: unknown 15896 1727203873.31489: variable 'ansible_shell_type' from source: unknown 15896 1727203873.31492: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.31495: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.31497: variable 'ansible_pipelining' from source: unknown 15896 1727203873.31499: variable 'ansible_timeout' from source: unknown 15896 1727203873.31501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.31706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203873.31780: variable 'omit' from source: magic vars 15896 1727203873.31783: starting attempt loop 15896 1727203873.31786: running the handler 15896 1727203873.31788: _low_level_execute_command(): starting 15896 1727203873.31790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 
1727203873.32598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.32622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.32640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.32757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.34544: stdout chunk (state=3): >>>/root <<< 15896 1727203873.34709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.34712: stdout chunk (state=3): >>><<< 15896 1727203873.34715: stderr chunk (state=3): >>><<< 15896 1727203873.34735: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203873.34756: _low_level_execute_command(): starting 15896 1727203873.34843: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610 `" && echo ansible-tmp-1727203873.34742-17753-163707146918610="` echo /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610 `" ) && sleep 0' 15896 1727203873.35439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203873.35443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203873.35446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203873.35449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203873.35452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203873.35462: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203873.35465: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.35467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203873.35478: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203873.35490: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203873.35547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203873.35550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203873.35552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203873.35590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.35599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.35612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.35630: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.35737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.37864: stdout chunk (state=3): >>>ansible-tmp-1727203873.34742-17753-163707146918610=/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610 <<< 15896 1727203873.37999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.38017: stderr chunk (state=3): >>><<< 15896 1727203873.38031: stdout chunk (state=3): >>><<< 15896 1727203873.38052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203873.34742-17753-163707146918610=/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203873.38212: variable 'ansible_module_compression' from source: unknown 15896 1727203873.38215: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203873.38224: variable 'ansible_facts' from source: unknown 15896 1727203873.38333: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py 15896 1727203873.38582: Sending initial data 15896 1727203873.38593: Sent initial data (151 bytes) 15896 1727203873.39144: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.39163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.39173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.39285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.41033: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203873.41112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203873.41205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp5ct4f6y3 /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py <<< 15896 1727203873.41209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py" <<< 15896 1727203873.41287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp5ct4f6y3" to remote "/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py" <<< 15896 1727203873.42198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.42236: stderr chunk (state=3): >>><<< 15896 1727203873.42246: stdout chunk (state=3): >>><<< 15896 1727203873.42318: done transferring module to remote 15896 1727203873.42345: _low_level_execute_command(): starting 15896 1727203873.42348: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/ /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py && sleep 0' 15896 1727203873.42964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203873.43072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203873.43090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.43128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.43151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.43173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.43294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.45306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.45309: stdout chunk (state=3): >>><<< 15896 1727203873.45312: stderr chunk (state=3): >>><<< 15896 1727203873.45388: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203873.45392: _low_level_execute_command(): starting 15896 1727203873.45394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/AnsiballZ_stat.py && sleep 0' 15896 1727203873.46079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203873.46106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203873.46122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203873.46166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.46192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203873.46282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.46308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 
15896 1727203873.46345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203873.46370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.46515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.63204: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28643, "dev": 23, "nlink": 1, "atime": 1727203872.0089178, "mtime": 1727203872.0089178, "ctime": 1727203872.0089178, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203873.64778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203873.64802: stderr chunk (state=3): >>><<< 15896 1727203873.64805: stdout chunk (state=3): >>><<< 15896 1727203873.64820: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28643, "dev": 23, "nlink": 1, "atime": 1727203872.0089178, "mtime": 1727203872.0089178, "ctime": 1727203872.0089178, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203873.64866: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203873.64877: _low_level_execute_command(): starting 15896 1727203873.64881: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203873.34742-17753-163707146918610/ > /dev/null 2>&1 && sleep 0' 15896 1727203873.65325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203873.65328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203873.65330: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203873.65333: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203873.65334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203873.65386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203873.65400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203873.65472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203873.67439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203873.67466: stderr chunk (state=3): >>><<< 15896 1727203873.67470: stdout chunk (state=3): >>><<< 15896 1727203873.67485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203873.67492: handler run complete 15896 1727203873.67523: attempt loop complete, returning result 15896 1727203873.67526: _execute() done 15896 1727203873.67528: dumping result to json 15896 1727203873.67535: done dumping result, returning 15896 1727203873.67543: done running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond [028d2410-947f-fb83-b6ad-000000000337] 15896 1727203873.67546: sending task result for task 028d2410-947f-fb83-b6ad-000000000337 15896 1727203873.67653: done sending task result for task 028d2410-947f-fb83-b6ad-000000000337 15896 1727203873.67657: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203872.0089178, "block_size": 4096, "blocks": 0, "ctime": 1727203872.0089178, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28643, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727203872.0089178, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15896 1727203873.67749: no more pending results, returning what we 
have 15896 1727203873.67753: results queue empty 15896 1727203873.67754: checking for any_errors_fatal 15896 1727203873.67756: done checking for any_errors_fatal 15896 1727203873.67756: checking for max_fail_percentage 15896 1727203873.67760: done checking for max_fail_percentage 15896 1727203873.67761: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.67761: done checking to see if all hosts have failed 15896 1727203873.67762: getting the remaining hosts for this loop 15896 1727203873.67764: done getting the remaining hosts for this loop 15896 1727203873.67771: getting the next task for host managed-node1 15896 1727203873.67779: done getting next task for host managed-node1 15896 1727203873.67782: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15896 1727203873.67784: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203873.67788: getting variables 15896 1727203873.67789: in VariableManager get_vars() 15896 1727203873.67833: Calling all_inventory to load vars for managed-node1 15896 1727203873.67836: Calling groups_inventory to load vars for managed-node1 15896 1727203873.67838: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.67847: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.67850: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.67855: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.68710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.69574: done with get_vars() 15896 1727203873.69595: done getting variables 15896 1727203873.69638: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203873.69730: variable 'interface' from source: task vars 15896 1727203873.69733: variable 'controller_device' from source: play vars 15896 1727203873.69777: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.404) 0:00:19.287 ***** 15896 1727203873.69805: entering _queue_task() for managed-node1/assert 15896 1727203873.70056: worker is 1 (out of 1 available) 15896 1727203873.70072: exiting _queue_task() for managed-node1/assert 15896 1727203873.70085: done queuing things up, now waiting for results queue to drain 15896 1727203873.70087: waiting for pending results... 
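The assert task queued above (task path `assert_device_present.yml:5`) will evaluate the conditional `interface_stat.stat.exists` recorded a few lines further down. A minimal sketch of what such a task likely looks like — the YAML below is a reconstruction from the task name and the evaluated conditional in this trace, not the verbatim file:

```yaml
# Hypothetical reconstruction of assert_device_present.yml:5 from this trace;
# the real file in fedora.linux_system_roles may differ.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      # interface_stat comes from the earlier "Get stat for interface nm-bond"
      # task, which stat'ed /sys/class/net/nm-bond
      - interface_stat.stat.exists
```

On success Ansible reports the default `MSG: All assertions passed`, which matches the `ok:` result that follows.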
15896 1727203873.70269: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' 15896 1727203873.70347: in run() - task 028d2410-947f-fb83-b6ad-00000000006f 15896 1727203873.70360: variable 'ansible_search_path' from source: unknown 15896 1727203873.70364: variable 'ansible_search_path' from source: unknown 15896 1727203873.70391: calling self._execute() 15896 1727203873.70465: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.70468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.70481: variable 'omit' from source: magic vars 15896 1727203873.70745: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.70756: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.70762: variable 'omit' from source: magic vars 15896 1727203873.70798: variable 'omit' from source: magic vars 15896 1727203873.70869: variable 'interface' from source: task vars 15896 1727203873.70872: variable 'controller_device' from source: play vars 15896 1727203873.70919: variable 'controller_device' from source: play vars 15896 1727203873.70932: variable 'omit' from source: magic vars 15896 1727203873.70968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203873.70996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203873.71013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203873.71026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.71036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.71063: variable 'inventory_hostname' from source: 
host vars for 'managed-node1' 15896 1727203873.71066: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.71068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.71137: Set connection var ansible_shell_type to sh 15896 1727203873.71143: Set connection var ansible_connection to ssh 15896 1727203873.71148: Set connection var ansible_shell_executable to /bin/sh 15896 1727203873.71153: Set connection var ansible_pipelining to False 15896 1727203873.71161: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203873.71164: Set connection var ansible_timeout to 10 15896 1727203873.71182: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.71187: variable 'ansible_connection' from source: unknown 15896 1727203873.71189: variable 'ansible_module_compression' from source: unknown 15896 1727203873.71193: variable 'ansible_shell_type' from source: unknown 15896 1727203873.71195: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.71197: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.71200: variable 'ansible_pipelining' from source: unknown 15896 1727203873.71202: variable 'ansible_timeout' from source: unknown 15896 1727203873.71204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.71306: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203873.71315: variable 'omit' from source: magic vars 15896 1727203873.71318: starting attempt loop 15896 1727203873.71320: running the handler 15896 1727203873.71413: variable 'interface_stat' from source: set_fact 15896 1727203873.71428: Evaluated conditional 
(interface_stat.stat.exists): True 15896 1727203873.71433: handler run complete 15896 1727203873.71447: attempt loop complete, returning result 15896 1727203873.71451: _execute() done 15896 1727203873.71454: dumping result to json 15896 1727203873.71457: done dumping result, returning 15896 1727203873.71462: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' [028d2410-947f-fb83-b6ad-00000000006f] 15896 1727203873.71468: sending task result for task 028d2410-947f-fb83-b6ad-00000000006f 15896 1727203873.71548: done sending task result for task 028d2410-947f-fb83-b6ad-00000000006f 15896 1727203873.71551: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203873.71607: no more pending results, returning what we have 15896 1727203873.71610: results queue empty 15896 1727203873.71611: checking for any_errors_fatal 15896 1727203873.71622: done checking for any_errors_fatal 15896 1727203873.71623: checking for max_fail_percentage 15896 1727203873.71625: done checking for max_fail_percentage 15896 1727203873.71625: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.71626: done checking to see if all hosts have failed 15896 1727203873.71627: getting the remaining hosts for this loop 15896 1727203873.71628: done getting the remaining hosts for this loop 15896 1727203873.71631: getting the next task for host managed-node1 15896 1727203873.71638: done getting next task for host managed-node1 15896 1727203873.71641: ^ task is: TASK: Include the task 'assert_profile_present.yml' 15896 1727203873.71643: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203873.71647: getting variables 15896 1727203873.71648: in VariableManager get_vars() 15896 1727203873.71706: Calling all_inventory to load vars for managed-node1 15896 1727203873.71709: Calling groups_inventory to load vars for managed-node1 15896 1727203873.71711: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.71722: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.71724: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.71726: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.72530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.73486: done with get_vars() 15896 1727203873.73503: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.037) 0:00:19.325 ***** 15896 1727203873.73574: entering _queue_task() for managed-node1/include_tasks 15896 1727203873.73837: worker is 1 (out of 1 available) 15896 1727203873.73852: exiting _queue_task() for managed-node1/include_tasks 15896 1727203873.73870: done queuing things up, now waiting for results queue to drain 15896 1727203873.73871: waiting for pending results... 
15896 1727203873.74047: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 15896 1727203873.74112: in run() - task 028d2410-947f-fb83-b6ad-000000000070 15896 1727203873.74123: variable 'ansible_search_path' from source: unknown 15896 1727203873.74165: variable 'controller_profile' from source: play vars 15896 1727203873.74308: variable 'controller_profile' from source: play vars 15896 1727203873.74321: variable 'port1_profile' from source: play vars 15896 1727203873.74370: variable 'port1_profile' from source: play vars 15896 1727203873.74379: variable 'port2_profile' from source: play vars 15896 1727203873.74424: variable 'port2_profile' from source: play vars 15896 1727203873.74435: variable 'omit' from source: magic vars 15896 1727203873.74538: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.74553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.74566: variable 'omit' from source: magic vars 15896 1727203873.74740: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.74747: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.74778: variable 'item' from source: unknown 15896 1727203873.74820: variable 'item' from source: unknown 15896 1727203873.74942: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.74945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.74948: variable 'omit' from source: magic vars 15896 1727203873.75027: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.75030: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.75051: variable 'item' from source: unknown 15896 1727203873.75100: variable 'item' from source: unknown 15896 1727203873.75163: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 
1727203873.75171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.75181: variable 'omit' from source: magic vars 15896 1727203873.75281: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.75285: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.75304: variable 'item' from source: unknown 15896 1727203873.75345: variable 'item' from source: unknown 15896 1727203873.75405: dumping result to json 15896 1727203873.75409: done dumping result, returning 15896 1727203873.75411: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [028d2410-947f-fb83-b6ad-000000000070] 15896 1727203873.75413: sending task result for task 028d2410-947f-fb83-b6ad-000000000070 15896 1727203873.75445: done sending task result for task 028d2410-947f-fb83-b6ad-000000000070 15896 1727203873.75447: WORKER PROCESS EXITING 15896 1727203873.75473: no more pending results, returning what we have 15896 1727203873.75479: in VariableManager get_vars() 15896 1727203873.75532: Calling all_inventory to load vars for managed-node1 15896 1727203873.75534: Calling groups_inventory to load vars for managed-node1 15896 1727203873.75536: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.75549: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.75552: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.75554: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.79692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.80531: done with get_vars() 15896 1727203873.80546: variable 'ansible_search_path' from source: unknown 15896 1727203873.80561: variable 'ansible_search_path' from source: unknown 15896 1727203873.80567: variable 'ansible_search_path' from source: unknown 15896 
1727203873.80573: we have included files to process 15896 1727203873.80574: generating all_blocks data 15896 1727203873.80576: done generating all_blocks data 15896 1727203873.80578: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80579: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80581: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80700: in VariableManager get_vars() 15896 1727203873.80720: done with get_vars() 15896 1727203873.80883: done processing included file 15896 1727203873.80884: iterating over new_blocks loaded from include file 15896 1727203873.80885: in VariableManager get_vars() 15896 1727203873.80904: done with get_vars() 15896 1727203873.80905: filtering new block on tags 15896 1727203873.80918: done filtering new block on tags 15896 1727203873.80919: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0) 15896 1727203873.80923: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80923: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80926: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.80987: in VariableManager get_vars() 15896 1727203873.81006: done with get_vars() 15896 1727203873.81151: done 
processing included file 15896 1727203873.81153: iterating over new_blocks loaded from include file 15896 1727203873.81153: in VariableManager get_vars() 15896 1727203873.81170: done with get_vars() 15896 1727203873.81171: filtering new block on tags 15896 1727203873.81185: done filtering new block on tags 15896 1727203873.81186: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.0) 15896 1727203873.81189: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.81189: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.81191: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15896 1727203873.81295: in VariableManager get_vars() 15896 1727203873.81312: done with get_vars() 15896 1727203873.81457: done processing included file 15896 1727203873.81460: iterating over new_blocks loaded from include file 15896 1727203873.81461: in VariableManager get_vars() 15896 1727203873.81477: done with get_vars() 15896 1727203873.81478: filtering new block on tags 15896 1727203873.81490: done filtering new block on tags 15896 1727203873.81491: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.1) 15896 1727203873.81493: extending task lists for all hosts with included blocks 15896 1727203873.84494: done extending task lists 15896 1727203873.84501: done processing included files 15896 1727203873.84501: results queue empty 15896 
1727203873.84502: checking for any_errors_fatal 15896 1727203873.84504: done checking for any_errors_fatal 15896 1727203873.84505: checking for max_fail_percentage 15896 1727203873.84506: done checking for max_fail_percentage 15896 1727203873.84506: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.84507: done checking to see if all hosts have failed 15896 1727203873.84507: getting the remaining hosts for this loop 15896 1727203873.84508: done getting the remaining hosts for this loop 15896 1727203873.84509: getting the next task for host managed-node1 15896 1727203873.84512: done getting next task for host managed-node1 15896 1727203873.84513: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15896 1727203873.84515: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203873.84517: getting variables 15896 1727203873.84517: in VariableManager get_vars() 15896 1727203873.84535: Calling all_inventory to load vars for managed-node1 15896 1727203873.84537: Calling groups_inventory to load vars for managed-node1 15896 1727203873.84538: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.84543: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.84544: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.84546: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.85262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.86128: done with get_vars() 15896 1727203873.86146: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.126) 0:00:19.451 ***** 15896 1727203873.86204: entering _queue_task() for managed-node1/include_tasks 15896 1727203873.86484: worker is 1 (out of 1 available) 15896 1727203873.86496: exiting _queue_task() for managed-node1/include_tasks 15896 1727203873.86509: done queuing things up, now waiting for results queue to drain 15896 1727203873.86511: waiting for pending results... 
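The include processed above ran `assert_profile_present.yml` once per item (`bond0`, `bond0.0`, `bond0.1`). A hedged sketch of the include task at `tests_bond_removal.yml:67`, inferred from the loop items and the play vars named in the trace (the actual playbook may differ):

```yaml
# Hypothetical reconstruction based on the three loop items seen in this
# trace and the play vars it resolves; 'profile' appears later in the log
# as "from source: include params".
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - "{{ controller_profile }}"  # resolves to bond0
    - "{{ port1_profile }}"       # resolves to bond0.0
    - "{{ port2_profile }}"       # resolves to bond0.1
```

Each iteration extends the task list with the included blocks, which is why the trace then queues `get_profile_stat.yml` for the first item.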
15896 1727203873.86760: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 15896 1727203873.86768: in run() - task 028d2410-947f-fb83-b6ad-000000000355 15896 1727203873.86771: variable 'ansible_search_path' from source: unknown 15896 1727203873.86774: variable 'ansible_search_path' from source: unknown 15896 1727203873.86786: calling self._execute() 15896 1727203873.86871: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.86877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.86888: variable 'omit' from source: magic vars 15896 1727203873.87172: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.87186: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.87189: _execute() done 15896 1727203873.87192: dumping result to json 15896 1727203873.87194: done dumping result, returning 15896 1727203873.87201: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-fb83-b6ad-000000000355] 15896 1727203873.87206: sending task result for task 028d2410-947f-fb83-b6ad-000000000355 15896 1727203873.87294: done sending task result for task 028d2410-947f-fb83-b6ad-000000000355 15896 1727203873.87296: WORKER PROCESS EXITING 15896 1727203873.87325: no more pending results, returning what we have 15896 1727203873.87330: in VariableManager get_vars() 15896 1727203873.87389: Calling all_inventory to load vars for managed-node1 15896 1727203873.87392: Calling groups_inventory to load vars for managed-node1 15896 1727203873.87394: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.87408: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.87410: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.87413: Calling groups_plugins_play to load vars for managed-node1 15896 
1727203873.88233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.89223: done with get_vars() 15896 1727203873.89236: variable 'ansible_search_path' from source: unknown 15896 1727203873.89237: variable 'ansible_search_path' from source: unknown 15896 1727203873.89267: we have included files to process 15896 1727203873.89268: generating all_blocks data 15896 1727203873.89269: done generating all_blocks data 15896 1727203873.89270: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203873.89271: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203873.89272: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203873.89934: done processing included file 15896 1727203873.89935: iterating over new_blocks loaded from include file 15896 1727203873.89936: in VariableManager get_vars() 15896 1727203873.89956: done with get_vars() 15896 1727203873.89957: filtering new block on tags 15896 1727203873.89973: done filtering new block on tags 15896 1727203873.89977: in VariableManager get_vars() 15896 1727203873.89994: done with get_vars() 15896 1727203873.89995: filtering new block on tags 15896 1727203873.90008: done filtering new block on tags 15896 1727203873.90010: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 15896 1727203873.90013: extending task lists for all hosts with included blocks 15896 1727203873.90114: done extending task lists 15896 1727203873.90115: done processing included files 15896 1727203873.90116: results queue empty 15896 
1727203873.90116: checking for any_errors_fatal 15896 1727203873.90119: done checking for any_errors_fatal 15896 1727203873.90119: checking for max_fail_percentage 15896 1727203873.90120: done checking for max_fail_percentage 15896 1727203873.90120: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.90121: done checking to see if all hosts have failed 15896 1727203873.90121: getting the remaining hosts for this loop 15896 1727203873.90122: done getting the remaining hosts for this loop 15896 1727203873.90124: getting the next task for host managed-node1 15896 1727203873.90126: done getting next task for host managed-node1 15896 1727203873.90129: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203873.90131: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203873.90133: getting variables 15896 1727203873.90133: in VariableManager get_vars() 15896 1727203873.90189: Calling all_inventory to load vars for managed-node1 15896 1727203873.90191: Calling groups_inventory to load vars for managed-node1 15896 1727203873.90192: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.90196: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.90198: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.90199: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.90821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.91718: done with get_vars() 15896 1727203873.91737: done getting variables 15896 1727203873.91770: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.055) 0:00:19.507 ***** 15896 1727203873.91792: entering _queue_task() for managed-node1/set_fact 15896 1727203873.92056: worker is 1 (out of 1 available) 15896 1727203873.92067: exiting _queue_task() for managed-node1/set_fact 15896 1727203873.92081: done queuing things up, now waiting for results queue to drain 15896 1727203873.92083: waiting for pending results... 
15896 1727203873.92253: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203873.92330: in run() - task 028d2410-947f-fb83-b6ad-0000000005e4 15896 1727203873.92344: variable 'ansible_search_path' from source: unknown 15896 1727203873.92347: variable 'ansible_search_path' from source: unknown 15896 1727203873.92378: calling self._execute() 15896 1727203873.92456: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.92460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.92472: variable 'omit' from source: magic vars 15896 1727203873.92746: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.92764: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.92767: variable 'omit' from source: magic vars 15896 1727203873.92798: variable 'omit' from source: magic vars 15896 1727203873.92822: variable 'omit' from source: magic vars 15896 1727203873.92863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203873.92888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203873.92904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203873.92918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.92927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.92950: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203873.92953: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.92960: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203873.93028: Set connection var ansible_shell_type to sh 15896 1727203873.93034: Set connection var ansible_connection to ssh 15896 1727203873.93039: Set connection var ansible_shell_executable to /bin/sh 15896 1727203873.93044: Set connection var ansible_pipelining to False 15896 1727203873.93049: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203873.93055: Set connection var ansible_timeout to 10 15896 1727203873.93084: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.93088: variable 'ansible_connection' from source: unknown 15896 1727203873.93090: variable 'ansible_module_compression' from source: unknown 15896 1727203873.93093: variable 'ansible_shell_type' from source: unknown 15896 1727203873.93095: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.93097: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.93099: variable 'ansible_pipelining' from source: unknown 15896 1727203873.93102: variable 'ansible_timeout' from source: unknown 15896 1727203873.93104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.93381: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203873.93385: variable 'omit' from source: magic vars 15896 1727203873.93387: starting attempt loop 15896 1727203873.93389: running the handler 15896 1727203873.93391: handler run complete 15896 1727203873.93394: attempt loop complete, returning result 15896 1727203873.93397: _execute() done 15896 1727203873.93399: dumping result to json 15896 1727203873.93402: done dumping result, returning 15896 1727203873.93404: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-fb83-b6ad-0000000005e4] 15896 1727203873.93407: sending task result for task 028d2410-947f-fb83-b6ad-0000000005e4 15896 1727203873.93477: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005e4 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15896 1727203873.93545: no more pending results, returning what we have 15896 1727203873.93549: results queue empty 15896 1727203873.93550: checking for any_errors_fatal 15896 1727203873.93551: done checking for any_errors_fatal 15896 1727203873.93552: checking for max_fail_percentage 15896 1727203873.93553: done checking for max_fail_percentage 15896 1727203873.93554: checking to see if all hosts have failed and the running result is not ok 15896 1727203873.93555: done checking to see if all hosts have failed 15896 1727203873.93555: getting the remaining hosts for this loop 15896 1727203873.93557: done getting the remaining hosts for this loop 15896 1727203873.93562: getting the next task for host managed-node1 15896 1727203873.93569: done getting next task for host managed-node1 15896 1727203873.93572: ^ task is: TASK: Stat profile file 15896 1727203873.93578: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203873.93583: getting variables 15896 1727203873.93585: in VariableManager get_vars() 15896 1727203873.93636: Calling all_inventory to load vars for managed-node1 15896 1727203873.93639: Calling groups_inventory to load vars for managed-node1 15896 1727203873.93641: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203873.93652: Calling all_plugins_play to load vars for managed-node1 15896 1727203873.93654: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203873.93660: Calling groups_plugins_play to load vars for managed-node1 15896 1727203873.94182: WORKER PROCESS EXITING 15896 1727203873.95019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203873.96729: done with get_vars() 15896 1727203873.96758: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:51:13 -0400 (0:00:00.050) 0:00:19.557 ***** 15896 1727203873.96858: entering _queue_task() for managed-node1/stat 15896 1727203873.97214: worker is 1 (out of 1 available) 15896 1727203873.97225: exiting _queue_task() for managed-node1/stat 15896 1727203873.97238: done queuing things up, now waiting for results queue to drain 15896 1727203873.97240: waiting for pending results... 
15896 1727203873.97528: running TaskExecutor() for managed-node1/TASK: Stat profile file 15896 1727203873.97682: in run() - task 028d2410-947f-fb83-b6ad-0000000005e5 15896 1727203873.97686: variable 'ansible_search_path' from source: unknown 15896 1727203873.97690: variable 'ansible_search_path' from source: unknown 15896 1727203873.97719: calling self._execute() 15896 1727203873.97882: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.97886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.97889: variable 'omit' from source: magic vars 15896 1727203873.98238: variable 'ansible_distribution_major_version' from source: facts 15896 1727203873.98262: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203873.98278: variable 'omit' from source: magic vars 15896 1727203873.98327: variable 'omit' from source: magic vars 15896 1727203873.98463: variable 'profile' from source: include params 15896 1727203873.98466: variable 'item' from source: include params 15896 1727203873.98508: variable 'item' from source: include params 15896 1727203873.98530: variable 'omit' from source: magic vars 15896 1727203873.98578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203873.98619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203873.98683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203873.98686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.98693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203873.98725: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203873.98732: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.98739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.98841: Set connection var ansible_shell_type to sh 15896 1727203873.98881: Set connection var ansible_connection to ssh 15896 1727203873.98884: Set connection var ansible_shell_executable to /bin/sh 15896 1727203873.98887: Set connection var ansible_pipelining to False 15896 1727203873.98889: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203873.98891: Set connection var ansible_timeout to 10 15896 1727203873.98914: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.98923: variable 'ansible_connection' from source: unknown 15896 1727203873.98982: variable 'ansible_module_compression' from source: unknown 15896 1727203873.98985: variable 'ansible_shell_type' from source: unknown 15896 1727203873.98987: variable 'ansible_shell_executable' from source: unknown 15896 1727203873.98989: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203873.98991: variable 'ansible_pipelining' from source: unknown 15896 1727203873.98994: variable 'ansible_timeout' from source: unknown 15896 1727203873.98996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203873.99173: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203873.99196: variable 'omit' from source: magic vars 15896 1727203873.99209: starting attempt loop 15896 1727203873.99217: running the handler 15896 1727203873.99244: _low_level_execute_command(): starting 15896 1727203873.99335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203873.99979: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.00099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.00125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.00144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.00340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.02058: stdout chunk (state=3): >>>/root <<< 15896 1727203874.02283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.02287: stdout chunk (state=3): >>><<< 15896 1727203874.02290: stderr chunk (state=3): >>><<< 15896 1727203874.02314: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.02341: _low_level_execute_command(): starting 15896 1727203874.02429: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799 `" && echo ansible-tmp-1727203874.0232756-17778-225159959928799="` echo /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799 `" ) && sleep 0' 15896 1727203874.02960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.02992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.03082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.03111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.03126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.03329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.05426: stdout chunk (state=3): >>>ansible-tmp-1727203874.0232756-17778-225159959928799=/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799 <<< 15896 1727203874.05555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.05600: stderr chunk (state=3): >>><<< 15896 1727203874.05619: stdout chunk (state=3): >>><<< 15896 1727203874.05639: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203874.0232756-17778-225159959928799=/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.05781: variable 'ansible_module_compression' from source: unknown 15896 1727203874.05785: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203874.05824: variable 'ansible_facts' from source: unknown 15896 1727203874.05922: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py 15896 1727203874.06135: Sending initial data 15896 1727203874.06144: Sent initial data (153 bytes) 15896 1727203874.06702: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.06793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.06826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.06844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.06868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.06984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.08797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203874.08900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203874.08980: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp3u0tw10o /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py <<< 15896 1727203874.08993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py" <<< 15896 1727203874.09056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15896 1727203874.09070: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp3u0tw10o" to remote "/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py" <<< 15896 1727203874.10110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.10113: stdout chunk (state=3): >>><<< 15896 1727203874.10115: stderr chunk (state=3): >>><<< 15896 1727203874.10117: done transferring module to remote 15896 1727203874.10119: _low_level_execute_command(): starting 15896 1727203874.10122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/ /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py && sleep 0' 15896 1727203874.10689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.10703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203874.10717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.10733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203874.10774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.10869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203874.10873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.10903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.11009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.13012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.13067: stderr chunk (state=3): >>><<< 15896 1727203874.13070: stdout chunk (state=3): >>><<< 15896 1727203874.13118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.13122: _low_level_execute_command(): starting 15896 1727203874.13124: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/AnsiballZ_stat.py && sleep 0' 15896 1727203874.13752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.13885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203874.13889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 
1727203874.13912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.14030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.30912: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203874.32446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203874.32451: stdout chunk (state=3): >>><<< 15896 1727203874.32453: stderr chunk (state=3): >>><<< 15896 1727203874.32455: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203874.32460: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203874.32463: _low_level_execute_command(): starting 15896 1727203874.32465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203874.0232756-17778-225159959928799/ > /dev/null 2>&1 && sleep 0' 15896 1727203874.33093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.33143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.33174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.33285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.35245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.35293: stderr chunk (state=3): >>><<< 15896 1727203874.35298: stdout chunk (state=3): >>><<< 15896 1727203874.35312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.35320: handler run complete 15896 1727203874.35335: attempt loop complete, returning result 15896 1727203874.35338: _execute() done 15896 1727203874.35340: dumping result to json 15896 1727203874.35343: done dumping result, returning 15896 1727203874.35351: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-fb83-b6ad-0000000005e5] 15896 1727203874.35353: sending task result for task 028d2410-947f-fb83-b6ad-0000000005e5 15896 1727203874.35448: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005e5 15896 1727203874.35451: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 15896 1727203874.35515: no more pending results, returning what we have 15896 1727203874.35519: results queue empty 15896 1727203874.35519: checking for any_errors_fatal 15896 1727203874.35525: done checking for any_errors_fatal 15896 1727203874.35525: checking for max_fail_percentage 15896 1727203874.35527: done checking for max_fail_percentage 15896 1727203874.35528: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.35528: done checking to see if all hosts have failed 15896 1727203874.35529: getting the remaining hosts for this loop 15896 1727203874.35530: done getting the remaining hosts for this loop 15896 1727203874.35534: getting the next task for host managed-node1 15896 1727203874.35539: done getting next task for host managed-node1 15896 1727203874.35541: ^ task is: TASK: Set NM profile exist flag based on the profile files 15896 1727203874.35545: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203874.35548: getting variables 15896 1727203874.35549: in VariableManager get_vars() 15896 1727203874.35615: Calling all_inventory to load vars for managed-node1 15896 1727203874.35618: Calling groups_inventory to load vars for managed-node1 15896 1727203874.35620: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.35629: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.35632: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.35634: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.36598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.38482: done with get_vars() 15896 1727203874.38513: done getting variables 15896 1727203874.38583: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.417) 0:00:19.975 ***** 15896 1727203874.38618: entering _queue_task() for managed-node1/set_fact 15896 1727203874.39001: worker is 1 (out of 1 available) 15896 1727203874.39183: exiting _queue_task() for managed-node1/set_fact 15896 1727203874.39195: done queuing things up, now waiting for results queue to drain 15896 1727203874.39197: waiting for pending results... 15896 1727203874.39437: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 15896 1727203874.39534: in run() - task 028d2410-947f-fb83-b6ad-0000000005e6 15896 1727203874.39538: variable 'ansible_search_path' from source: unknown 15896 1727203874.39540: variable 'ansible_search_path' from source: unknown 15896 1727203874.39549: calling self._execute() 15896 1727203874.39666: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.39681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.39697: variable 'omit' from source: magic vars 15896 1727203874.40094: variable 'ansible_distribution_major_version' from source: facts 15896 1727203874.40110: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203874.40294: variable 'profile_stat' from source: set_fact 15896 1727203874.40297: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203874.40300: when evaluation is False, skipping this task 15896 1727203874.40302: _execute() done 15896 1727203874.40305: dumping result to json 15896 1727203874.40306: done dumping result, returning 15896 1727203874.40309: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-fb83-b6ad-0000000005e6] 15896 1727203874.40310: sending task result for task 
028d2410-947f-fb83-b6ad-0000000005e6 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203874.40542: no more pending results, returning what we have 15896 1727203874.40546: results queue empty 15896 1727203874.40547: checking for any_errors_fatal 15896 1727203874.40556: done checking for any_errors_fatal 15896 1727203874.40557: checking for max_fail_percentage 15896 1727203874.40561: done checking for max_fail_percentage 15896 1727203874.40562: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.40563: done checking to see if all hosts have failed 15896 1727203874.40564: getting the remaining hosts for this loop 15896 1727203874.40565: done getting the remaining hosts for this loop 15896 1727203874.40569: getting the next task for host managed-node1 15896 1727203874.40582: done getting next task for host managed-node1 15896 1727203874.40585: ^ task is: TASK: Get NM profile info 15896 1727203874.40590: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203874.40594: getting variables 15896 1727203874.40596: in VariableManager get_vars() 15896 1727203874.40654: Calling all_inventory to load vars for managed-node1 15896 1727203874.40657: Calling groups_inventory to load vars for managed-node1 15896 1727203874.40662: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.40799: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005e6 15896 1727203874.40802: WORKER PROCESS EXITING 15896 1727203874.40816: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.40819: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.40822: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.42508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.44142: done with get_vars() 15896 1727203874.44173: done getting variables 15896 1727203874.44243: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.056) 0:00:20.032 ***** 15896 1727203874.44277: entering _queue_task() for managed-node1/shell 15896 1727203874.44768: worker is 1 (out of 1 available) 15896 1727203874.44780: exiting _queue_task() for managed-node1/shell 15896 1727203874.44791: done queuing things up, now waiting for results queue to drain 15896 1727203874.44792: waiting for pending results... 
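For orientation while reading the trace: the task skipped just above ("Set NM profile exist flag based on the profile files", whose conditional `profile_stat.stat.exists` evaluated to False) would correspond to a task of roughly this shape at `tasks/get_profile_stat.yml:17`. This is a hypothetical reconstruction from the log, not the actual file contents; the flag variable name in particular is an assumption.

```yaml
# Hypothetical sketch reconstructed from the trace above; the real task in
# get_profile_stat.yml may differ in variable names and details.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true   # flag name is an assumption, not from the log
  when: profile_stat.stat.exists   # evaluated False above, so the task was skipped
```

The trace shows the engine evaluating this `when:` after the distribution-version guard (`ansible_distribution_major_version != '6'`: True), then short-circuiting with "when evaluation is False, skipping this task" before any connection is opened.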
15896 1727203874.44984: running TaskExecutor() for managed-node1/TASK: Get NM profile info 15896 1727203874.45124: in run() - task 028d2410-947f-fb83-b6ad-0000000005e7 15896 1727203874.45199: variable 'ansible_search_path' from source: unknown 15896 1727203874.45203: variable 'ansible_search_path' from source: unknown 15896 1727203874.45206: calling self._execute() 15896 1727203874.45317: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.45328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.45344: variable 'omit' from source: magic vars 15896 1727203874.45762: variable 'ansible_distribution_major_version' from source: facts 15896 1727203874.45849: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203874.45854: variable 'omit' from source: magic vars 15896 1727203874.45857: variable 'omit' from source: magic vars 15896 1727203874.45963: variable 'profile' from source: include params 15896 1727203874.45974: variable 'item' from source: include params 15896 1727203874.46048: variable 'item' from source: include params 15896 1727203874.46081: variable 'omit' from source: magic vars 15896 1727203874.46132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203874.46180: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203874.46207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203874.46283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203874.46286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203874.46293: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203874.46301: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.46309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.46421: Set connection var ansible_shell_type to sh 15896 1727203874.46433: Set connection var ansible_connection to ssh 15896 1727203874.46443: Set connection var ansible_shell_executable to /bin/sh 15896 1727203874.46453: Set connection var ansible_pipelining to False 15896 1727203874.46464: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203874.46474: Set connection var ansible_timeout to 10 15896 1727203874.46509: variable 'ansible_shell_executable' from source: unknown 15896 1727203874.46517: variable 'ansible_connection' from source: unknown 15896 1727203874.46523: variable 'ansible_module_compression' from source: unknown 15896 1727203874.46529: variable 'ansible_shell_type' from source: unknown 15896 1727203874.46534: variable 'ansible_shell_executable' from source: unknown 15896 1727203874.46581: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.46584: variable 'ansible_pipelining' from source: unknown 15896 1727203874.46586: variable 'ansible_timeout' from source: unknown 15896 1727203874.46588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.46715: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203874.46735: variable 'omit' from source: magic vars 15896 1727203874.46744: starting attempt loop 15896 1727203874.46751: running the handler 15896 1727203874.46769: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203874.46827: _low_level_execute_command(): starting 15896 1727203874.46830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203874.47609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.47710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.47738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.47826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.49684: stdout chunk (state=3): >>>/root <<< 15896 1727203874.49983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.49987: stdout chunk (state=3): >>><<< 15896 1727203874.49990: stderr chunk (state=3): >>><<< 15896 1727203874.49992: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.49995: _low_level_execute_command(): starting 15896 1727203874.50002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018 `" && echo ansible-tmp-1727203874.4992013-17802-50293576944018="` echo /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018 `" ) && sleep 0' 15896 1727203874.50672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.50685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.50784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.50809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.50924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.53048: stdout chunk (state=3): >>>ansible-tmp-1727203874.4992013-17802-50293576944018=/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018 <<< 15896 1727203874.53181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.53236: stderr chunk (state=3): >>><<< 15896 1727203874.53240: stdout chunk (state=3): >>><<< 15896 1727203874.53257: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203874.4992013-17802-50293576944018=/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.53383: variable 'ansible_module_compression' from source: unknown 15896 1727203874.53386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203874.53419: variable 'ansible_facts' from source: unknown 15896 1727203874.53511: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py 15896 1727203874.53702: Sending initial data 15896 1727203874.53705: Sent initial data (155 bytes) 15896 1727203874.54332: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.54399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.54465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.54485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.54508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.54627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.56389: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203874.56480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203874.56562: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp1ofc01g_ /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py <<< 15896 1727203874.56566: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py" <<< 15896 1727203874.56645: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp1ofc01g_" to remote "/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py" <<< 15896 1727203874.57586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.57590: stdout chunk (state=3): >>><<< 15896 1727203874.57592: stderr chunk (state=3): >>><<< 15896 1727203874.57619: done transferring module to remote 15896 1727203874.57635: _low_level_execute_command(): starting 15896 1727203874.57646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/ /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py && sleep 0' 15896 1727203874.58283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.58298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203874.58320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.58337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203874.58355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 
1727203874.58425: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.58473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203874.58503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.58527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.58640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.60660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.60672: stdout chunk (state=3): >>><<< 15896 1727203874.60688: stderr chunk (state=3): >>><<< 15896 1727203874.60709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.60727: _low_level_execute_command(): starting 15896 1727203874.60739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/AnsiballZ_command.py && sleep 0' 15896 1727203874.61385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203874.61401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203874.61416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.61444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203874.61465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203874.61488: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.61555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.61602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.61624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.61745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.80679: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:51:14.783039", "end": "2024-09-24 14:51:14.805119", "delta": "0:00:00.022080", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203874.82481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203874.82506: stderr chunk (state=3): >>><<< 15896 1727203874.82509: stdout chunk (state=3): >>><<< 15896 1727203874.82530: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:51:14.783039", "end": "2024-09-24 14:51:14.805119", "delta": "0:00:00.022080", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203874.82560: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203874.82569: _low_level_execute_command(): starting 15896 1727203874.82574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203874.4992013-17802-50293576944018/ > /dev/null 2>&1 && sleep 0' 15896 1727203874.83028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203874.83036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203874.83038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.83040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203874.83042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203874.83086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203874.83103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203874.83186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203874.85164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203874.85192: stderr chunk (state=3): >>><<< 15896 1727203874.85195: stdout chunk (state=3): >>><<< 15896 1727203874.85211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203874.85217: handler run complete 15896 1727203874.85238: Evaluated conditional (False): False 15896 1727203874.85246: attempt loop complete, returning result 15896 1727203874.85248: _execute() done 15896 1727203874.85251: dumping result to json 15896 1727203874.85256: done dumping result, returning 15896 1727203874.85266: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-fb83-b6ad-0000000005e7] 15896 1727203874.85270: sending task result for task 028d2410-947f-fb83-b6ad-0000000005e7 15896 1727203874.85374: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005e7 15896 1727203874.85379: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.022080", "end": "2024-09-24 14:51:14.805119", "rc": 0, "start": "2024-09-24 14:51:14.783039" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 15896 1727203874.85448: no more pending results, returning what we have 15896 1727203874.85453: results queue empty 15896 1727203874.85453: checking for any_errors_fatal 15896 1727203874.85459: done checking for any_errors_fatal 15896 1727203874.85460: checking for max_fail_percentage 15896 1727203874.85462: done checking for max_fail_percentage 15896 1727203874.85463: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.85464: done checking to see if all hosts have failed 15896 1727203874.85464: getting the 
remaining hosts for this loop 15896 1727203874.85466: done getting the remaining hosts for this loop 15896 1727203874.85469: getting the next task for host managed-node1 15896 1727203874.85476: done getting next task for host managed-node1 15896 1727203874.85479: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203874.85483: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203874.85487: getting variables 15896 1727203874.85488: in VariableManager get_vars() 15896 1727203874.85538: Calling all_inventory to load vars for managed-node1 15896 1727203874.85541: Calling groups_inventory to load vars for managed-node1 15896 1727203874.85543: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.85553: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.85555: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.85558: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.86345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.87304: done with get_vars() 15896 1727203874.87321: done getting variables 15896 1727203874.87364: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.431) 0:00:20.463 ***** 15896 1727203874.87388: entering _queue_task() for managed-node1/set_fact 15896 1727203874.87630: worker is 1 (out of 1 available) 15896 1727203874.87644: exiting _queue_task() for managed-node1/set_fact 15896 1727203874.87657: done queuing things up, now waiting for results queue to drain 15896 1727203874.87658: waiting for pending results... 
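The "Get NM profile info" result above comes from `nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc`: the task succeeds (rc == 0) when at least one nmcli line mentions the profile name and a file under /etc, and the following set_fact task turns that into the `lsr_net_profile_exists` flag. A minimal sketch of that filter, using the STDOUT shown in the log as sample data (the `profile_lines_matching` helper is hypothetical, not part of Ansible or nmcli):

```python
def profile_lines_matching(nmcli_output: str, profile: str) -> list[str]:
    """Emulate `grep <profile> | grep /etc` over `nmcli -f NAME,FILENAME` output."""
    return [
        line for line in nmcli_output.splitlines()
        if profile in line and "/etc" in line
    ]

# Sample taken verbatim from the task's STDOUT in the log above:
sample = """\
bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection
bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection
bond0 /etc/NetworkManager/system-connections/bond0.nmconnection
"""

matches = profile_lines_matching(sample, "bond0")
# A non-empty match list corresponds to rc == 0, which the next task
# (set_fact) maps to lsr_net_profile_exists: true.
nm_profile_exists = len(matches) > 0
```

All three bond0 connection files live under /etc/NetworkManager/system-connections, so the grep pipeline matches and the exists flag is set true.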
15896 1727203874.87830: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203874.87904: in run() - task 028d2410-947f-fb83-b6ad-0000000005e8 15896 1727203874.87917: variable 'ansible_search_path' from source: unknown 15896 1727203874.87920: variable 'ansible_search_path' from source: unknown 15896 1727203874.87948: calling self._execute() 15896 1727203874.88031: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.88035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.88044: variable 'omit' from source: magic vars 15896 1727203874.88320: variable 'ansible_distribution_major_version' from source: facts 15896 1727203874.88330: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203874.88420: variable 'nm_profile_exists' from source: set_fact 15896 1727203874.88433: Evaluated conditional (nm_profile_exists.rc == 0): True 15896 1727203874.88442: variable 'omit' from source: magic vars 15896 1727203874.88470: variable 'omit' from source: magic vars 15896 1727203874.88495: variable 'omit' from source: magic vars 15896 1727203874.88527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203874.88554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203874.88575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203874.88590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203874.88600: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203874.88623: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
15896 1727203874.88625: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.88628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.88702: Set connection var ansible_shell_type to sh 15896 1727203874.88708: Set connection var ansible_connection to ssh 15896 1727203874.88713: Set connection var ansible_shell_executable to /bin/sh 15896 1727203874.88718: Set connection var ansible_pipelining to False 15896 1727203874.88723: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203874.88728: Set connection var ansible_timeout to 10 15896 1727203874.88744: variable 'ansible_shell_executable' from source: unknown 15896 1727203874.88748: variable 'ansible_connection' from source: unknown 15896 1727203874.88751: variable 'ansible_module_compression' from source: unknown 15896 1727203874.88753: variable 'ansible_shell_type' from source: unknown 15896 1727203874.88756: variable 'ansible_shell_executable' from source: unknown 15896 1727203874.88758: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.88769: variable 'ansible_pipelining' from source: unknown 15896 1727203874.88773: variable 'ansible_timeout' from source: unknown 15896 1727203874.88777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.88870: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203874.88883: variable 'omit' from source: magic vars 15896 1727203874.88886: starting attempt loop 15896 1727203874.88889: running the handler 15896 1727203874.88899: handler run complete 15896 1727203874.88907: attempt loop complete, returning result 15896 1727203874.88910: _execute() done 
15896 1727203874.88912: dumping result to json 15896 1727203874.88915: done dumping result, returning 15896 1727203874.88922: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-fb83-b6ad-0000000005e8] 15896 1727203874.88926: sending task result for task 028d2410-947f-fb83-b6ad-0000000005e8 15896 1727203874.89005: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005e8 15896 1727203874.89008: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
15896 1727203874.89059: no more pending results, returning what we have 15896 1727203874.89062: results queue empty 15896 1727203874.89063: checking for any_errors_fatal 15896 1727203874.89069: done checking for any_errors_fatal 15896 1727203874.89070: checking for max_fail_percentage 15896 1727203874.89072: done checking for max_fail_percentage 15896 1727203874.89073: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.89073: done checking to see if all hosts have failed 15896 1727203874.89074: getting the remaining hosts for this loop 15896 1727203874.89077: done getting the remaining hosts for this loop 15896 1727203874.89080: getting the next task for host managed-node1 15896 1727203874.89088: done getting next task for host managed-node1 15896 1727203874.89090: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203874.89094: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203874.89098: getting variables 15896 1727203874.89099: in VariableManager get_vars() 15896 1727203874.89144: Calling all_inventory to load vars for managed-node1 15896 1727203874.89146: Calling groups_inventory to load vars for managed-node1 15896 1727203874.89148: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.89156: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.89159: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.89162: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.89922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.90772: done with get_vars() 15896 1727203874.90789: done getting variables 15896 1727203874.90842: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203874.90944: variable 'profile' from source: include params 15896 1727203874.90947: variable 'item' from source: include params 15896 1727203874.90992: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.036) 0:00:20.499 ***** 15896 1727203874.91017: entering _queue_task() for managed-node1/command 15896 1727203874.91228: worker is 1 (out of 1 available) 15896 1727203874.91241: exiting _queue_task() for managed-node1/command 15896 1727203874.91253: done queuing things up, now waiting for results queue to drain 15896 1727203874.91254: waiting for pending results... 15896 1727203874.91421: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 15896 1727203874.91498: in run() - task 028d2410-947f-fb83-b6ad-0000000005ea 15896 1727203874.91512: variable 'ansible_search_path' from source: unknown 15896 1727203874.91516: variable 'ansible_search_path' from source: unknown 15896 1727203874.91542: calling self._execute() 15896 1727203874.91622: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.91626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.91635: variable 'omit' from source: magic vars 15896 1727203874.91893: variable 'ansible_distribution_major_version' from source: facts 15896 1727203874.91902: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203874.91986: variable 'profile_stat' from source: set_fact 15896 1727203874.91998: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203874.92001: when evaluation is False, skipping this task 15896 1727203874.92004: _execute() done 15896 1727203874.92006: dumping result to json 15896 1727203874.92009: done dumping result, returning 15896 1727203874.92016: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [028d2410-947f-fb83-b6ad-0000000005ea] 15896 1727203874.92029: sending task result for task 028d2410-947f-fb83-b6ad-0000000005ea 15896 
1727203874.92104: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005ea 15896 1727203874.92107: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15896 1727203874.92186: no more pending results, returning what we have 15896 1727203874.92189: results queue empty 15896 1727203874.92190: checking for any_errors_fatal 15896 1727203874.92194: done checking for any_errors_fatal 15896 1727203874.92195: checking for max_fail_percentage 15896 1727203874.92197: done checking for max_fail_percentage 15896 1727203874.92197: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.92198: done checking to see if all hosts have failed 15896 1727203874.92199: getting the remaining hosts for this loop 15896 1727203874.92200: done getting the remaining hosts for this loop 15896 1727203874.92203: getting the next task for host managed-node1 15896 1727203874.92208: done getting next task for host managed-node1 15896 1727203874.92210: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203874.92213: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203874.92217: getting variables 15896 1727203874.92218: in VariableManager get_vars() 15896 1727203874.92261: Calling all_inventory to load vars for managed-node1 15896 1727203874.92264: Calling groups_inventory to load vars for managed-node1 15896 1727203874.92266: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.92277: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.92279: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.92282: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.93333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.94443: done with get_vars() 15896 1727203874.94460: done getting variables 15896 1727203874.94504: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203874.94581: variable 'profile' from source: include params 15896 1727203874.94584: variable 'item' from source: include params 15896 1727203874.94625: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.036) 0:00:20.535 ***** 15896 1727203874.94646: entering _queue_task() for managed-node1/set_fact 15896 1727203874.94880: worker is 1 (out of 1 available) 15896 1727203874.94892: exiting _queue_task() for managed-node1/set_fact 15896 1727203874.94904: done queuing things up, now waiting for results queue 
to drain 15896 1727203874.94906: waiting for pending results... 15896 1727203874.95067: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 15896 1727203874.95140: in run() - task 028d2410-947f-fb83-b6ad-0000000005eb 15896 1727203874.95149: variable 'ansible_search_path' from source: unknown 15896 1727203874.95152: variable 'ansible_search_path' from source: unknown 15896 1727203874.95182: calling self._execute() 15896 1727203874.95263: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203874.95267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203874.95274: variable 'omit' from source: magic vars 15896 1727203874.95537: variable 'ansible_distribution_major_version' from source: facts 15896 1727203874.95545: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203874.95629: variable 'profile_stat' from source: set_fact 15896 1727203874.95640: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203874.95643: when evaluation is False, skipping this task 15896 1727203874.95646: _execute() done 15896 1727203874.95649: dumping result to json 15896 1727203874.95652: done dumping result, returning 15896 1727203874.95660: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [028d2410-947f-fb83-b6ad-0000000005eb] 15896 1727203874.95663: sending task result for task 028d2410-947f-fb83-b6ad-0000000005eb 15896 1727203874.95783: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005eb 15896 1727203874.95787: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15896 1727203874.95834: no more pending results, returning what we have 15896 1727203874.95838: results queue empty 15896 1727203874.95839: checking for any_errors_fatal 15896 1727203874.95845: 
done checking for any_errors_fatal 15896 1727203874.95846: checking for max_fail_percentage 15896 1727203874.95847: done checking for max_fail_percentage 15896 1727203874.95848: checking to see if all hosts have failed and the running result is not ok 15896 1727203874.95849: done checking to see if all hosts have failed 15896 1727203874.95850: getting the remaining hosts for this loop 15896 1727203874.95852: done getting the remaining hosts for this loop 15896 1727203874.95856: getting the next task for host managed-node1 15896 1727203874.95866: done getting next task for host managed-node1 15896 1727203874.95868: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15896 1727203874.95873: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203874.95881: getting variables 15896 1727203874.95882: in VariableManager get_vars() 15896 1727203874.95935: Calling all_inventory to load vars for managed-node1 15896 1727203874.95938: Calling groups_inventory to load vars for managed-node1 15896 1727203874.95940: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203874.95953: Calling all_plugins_play to load vars for managed-node1 15896 1727203874.95956: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203874.95962: Calling groups_plugins_play to load vars for managed-node1 15896 1727203874.97668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203874.99231: done with get_vars() 15896 1727203874.99258: done getting variables 15896 1727203874.99320: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203874.99432: variable 'profile' from source: include params 15896 1727203874.99436: variable 'item' from source: include params 15896 1727203874.99497: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:51:14 -0400 (0:00:00.048) 0:00:20.584 ***** 15896 1727203874.99527: entering _queue_task() for managed-node1/command 15896 1727203874.99877: worker is 1 (out of 1 available) 15896 1727203874.99890: exiting _queue_task() for managed-node1/command 15896 1727203874.99903: done queuing things up, now waiting for results queue to drain 15896 1727203874.99905: waiting for pending results... 
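The run of skipped tasks above all fail the same gate: get_profile_stat.yml stats the legacy ifcfg file for the profile, and because this system stores its connections as NetworkManager keyfiles (the .nmconnection paths shown earlier), `profile_stat.stat.exists` comes back False, so every task conditioned on it is skipped with "Conditional result was False". A small sketch of that skip behavior, assuming a hypothetical `run_task` wrapper (the stat value is taken from the log; this is not Ansible's actual executor code):

```python
# Result of the stat task, per the log: no ifcfg-bond0 file exists.
profile_stat = {"stat": {"exists": False}}

def run_task(name: str, when: bool) -> dict:
    """Mimic Ansible's `when:` gate: a false condition yields a skip result."""
    if not when:
        return {
            "changed": False,
            "skip_reason": "Conditional result was False",
            "skipped": True,
            "task": name,
        }
    # A true condition would actually execute the task here.
    return {"changed": True, "skipped": False, "task": name}

result = run_task(
    "Get the fingerprint comment in ifcfg-bond0",
    when=profile_stat["stat"]["exists"],
)
```

Each of the four ifcfg-related tasks in this stretch of the log produces exactly this kind of skip result, which is why none of them contacts the managed node.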
15896 1727203875.00187: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 15896 1727203875.00395: in run() - task 028d2410-947f-fb83-b6ad-0000000005ec 15896 1727203875.00398: variable 'ansible_search_path' from source: unknown 15896 1727203875.00402: variable 'ansible_search_path' from source: unknown 15896 1727203875.00405: calling self._execute() 15896 1727203875.00470: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.00485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.00504: variable 'omit' from source: magic vars 15896 1727203875.00948: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.00967: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.01118: variable 'profile_stat' from source: set_fact 15896 1727203875.01139: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203875.01152: when evaluation is False, skipping this task 15896 1727203875.01160: _execute() done 15896 1727203875.01169: dumping result to json 15896 1727203875.01179: done dumping result, returning 15896 1727203875.01190: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 [028d2410-947f-fb83-b6ad-0000000005ec] 15896 1727203875.01200: sending task result for task 028d2410-947f-fb83-b6ad-0000000005ec
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15896 1727203875.01483: no more pending results, returning what we have 15896 1727203875.01487: results queue empty 15896 1727203875.01488: checking for any_errors_fatal 15896 1727203875.01496: done checking for any_errors_fatal 15896 1727203875.01497: checking for max_fail_percentage 15896 1727203875.01499: done checking for max_fail_percentage 15896 1727203875.01500: checking to see if all hosts have failed 
and the running result is not ok 15896 1727203875.01500: done checking to see if all hosts have failed 15896 1727203875.01502: getting the remaining hosts for this loop 15896 1727203875.01503: done getting the remaining hosts for this loop 15896 1727203875.01508: getting the next task for host managed-node1 15896 1727203875.01515: done getting next task for host managed-node1 15896 1727203875.01517: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15896 1727203875.01522: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203875.01526: getting variables 15896 1727203875.01528: in VariableManager get_vars() 15896 1727203875.01592: Calling all_inventory to load vars for managed-node1 15896 1727203875.01594: Calling groups_inventory to load vars for managed-node1 15896 1727203875.01597: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.01612: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.01616: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.01619: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.02189: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005ec 15896 1727203875.02193: WORKER PROCESS EXITING 15896 1727203875.04530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.07719: done with get_vars() 15896 1727203875.07751: done getting variables 15896 1727203875.07817: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203875.08268: variable 'profile' from source: include params 15896 1727203875.08273: variable 'item' from source: include params 15896 1727203875.08333: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.088) 0:00:20.672 ***** 15896 1727203875.08365: entering _queue_task() for managed-node1/set_fact 15896 1727203875.09012: worker is 1 (out of 1 available) 15896 1727203875.09024: exiting _queue_task() for managed-node1/set_fact 15896 
1727203875.09035: done queuing things up, now waiting for results queue to drain 15896 1727203875.09037: waiting for pending results... 15896 1727203875.09310: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 15896 1727203875.09430: in run() - task 028d2410-947f-fb83-b6ad-0000000005ed 15896 1727203875.09452: variable 'ansible_search_path' from source: unknown 15896 1727203875.09460: variable 'ansible_search_path' from source: unknown 15896 1727203875.09505: calling self._execute() 15896 1727203875.09625: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.09636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.09650: variable 'omit' from source: magic vars 15896 1727203875.10021: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.10281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.10285: variable 'profile_stat' from source: set_fact 15896 1727203875.10288: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203875.10291: when evaluation is False, skipping this task 15896 1727203875.10293: _execute() done 15896 1727203875.10296: dumping result to json 15896 1727203875.10298: done dumping result, returning 15896 1727203875.10300: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [028d2410-947f-fb83-b6ad-0000000005ed] 15896 1727203875.10302: sending task result for task 028d2410-947f-fb83-b6ad-0000000005ed 15896 1727203875.10372: done sending task result for task 028d2410-947f-fb83-b6ad-0000000005ed 15896 1727203875.10377: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
15896 1727203875.10427: no more pending results, returning what we have 15896 1727203875.10432: results queue empty 15896 
1727203875.10433: checking for any_errors_fatal 15896 1727203875.10441: done checking for any_errors_fatal 15896 1727203875.10442: checking for max_fail_percentage 15896 1727203875.10444: done checking for max_fail_percentage 15896 1727203875.10445: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.10445: done checking to see if all hosts have failed 15896 1727203875.10446: getting the remaining hosts for this loop 15896 1727203875.10448: done getting the remaining hosts for this loop 15896 1727203875.10452: getting the next task for host managed-node1 15896 1727203875.10461: done getting next task for host managed-node1 15896 1727203875.10464: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15896 1727203875.10467: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203875.10472: getting variables 15896 1727203875.10474: in VariableManager get_vars() 15896 1727203875.10538: Calling all_inventory to load vars for managed-node1 15896 1727203875.10541: Calling groups_inventory to load vars for managed-node1 15896 1727203875.10543: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.10557: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.10560: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.10563: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.12244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.13643: done with get_vars() 15896 1727203875.13678: done getting variables 15896 1727203875.13742: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203875.13856: variable 'profile' from source: include params 15896 1727203875.13860: variable 'item' from source: include params 15896 1727203875.13918: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.055) 0:00:20.728 ***** 15896 1727203875.13948: entering _queue_task() for managed-node1/assert 15896 1727203875.14299: worker is 1 (out of 1 available) 15896 1727203875.14311: exiting _queue_task() for managed-node1/assert 15896 1727203875.14323: done queuing things up, now waiting for results queue to drain 15896 1727203875.14325: waiting for pending results... 
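The run now moves into "Assert that the profile is present - 'bond0'" (assert_profile_present.yml:5), which checks the `lsr_net_profile_exists` fact that the earlier set_fact task derived from the nmcli grep. A sketch of that assertion logic, with fact values copied from the log (the `evaluate_assert` helper is hypothetical, not the real `ansible.builtin.assert` action plugin):

```python
# Facts as set earlier in the log by the set_fact task.
ansible_facts = {
    "lsr_net_profile_ansible_managed": True,
    "lsr_net_profile_exists": True,
    "lsr_net_profile_fingerprint": True,
}

def evaluate_assert(facts: dict, that: list[str]) -> dict:
    """Pass when every named fact is truthy, mirroring `assert: that: [...]`."""
    failed = [cond for cond in that if not facts.get(cond)]
    if failed:
        return {"failed": True, "assertion": failed[0], "evaluated_to": False}
    return {"changed": False, "failed": False, "msg": "All assertions passed"}

outcome = evaluate_assert(ansible_facts, ["lsr_net_profile_exists"])
```

Because the nmcli check found bond0 under /etc/NetworkManager/system-connections, the fact is true and the assert task passes for this profile.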
15896 1727203875.14605: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' 15896 1727203875.14716: in run() - task 028d2410-947f-fb83-b6ad-000000000356 15896 1727203875.14739: variable 'ansible_search_path' from source: unknown 15896 1727203875.14785: variable 'ansible_search_path' from source: unknown 15896 1727203875.14800: calling self._execute() 15896 1727203875.14914: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.14926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.14941: variable 'omit' from source: magic vars 15896 1727203875.15316: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.15345: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.15348: variable 'omit' from source: magic vars 15896 1727203875.15580: variable 'omit' from source: magic vars 15896 1727203875.15584: variable 'profile' from source: include params 15896 1727203875.15587: variable 'item' from source: include params 15896 1727203875.15590: variable 'item' from source: include params 15896 1727203875.15592: variable 'omit' from source: magic vars 15896 1727203875.15640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203875.15684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203875.15717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203875.15742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.15760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.15798: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203875.15807: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.15818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.15925: Set connection var ansible_shell_type to sh 15896 1727203875.15940: Set connection var ansible_connection to ssh 15896 1727203875.15951: Set connection var ansible_shell_executable to /bin/sh 15896 1727203875.15961: Set connection var ansible_pipelining to False 15896 1727203875.15971: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203875.15984: Set connection var ansible_timeout to 10 15896 1727203875.16013: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.16020: variable 'ansible_connection' from source: unknown 15896 1727203875.16029: variable 'ansible_module_compression' from source: unknown 15896 1727203875.16039: variable 'ansible_shell_type' from source: unknown 15896 1727203875.16045: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.16051: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.16057: variable 'ansible_pipelining' from source: unknown 15896 1727203875.16064: variable 'ansible_timeout' from source: unknown 15896 1727203875.16071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.16224: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203875.16242: variable 'omit' from source: magic vars 15896 1727203875.16259: starting attempt loop 15896 1727203875.16364: running the handler 15896 1727203875.16395: variable 'lsr_net_profile_exists' from source: set_fact 15896 1727203875.16407: Evaluated conditional 
(lsr_net_profile_exists): True 15896 1727203875.16418: handler run complete 15896 1727203875.16439: attempt loop complete, returning result 15896 1727203875.16447: _execute() done 15896 1727203875.16454: dumping result to json 15896 1727203875.16462: done dumping result, returning 15896 1727203875.16480: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' [028d2410-947f-fb83-b6ad-000000000356] 15896 1727203875.16493: sending task result for task 028d2410-947f-fb83-b6ad-000000000356

ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed

15896 1727203875.16744: no more pending results, returning what we have 15896 1727203875.16748: results queue empty 15896 1727203875.16748: checking for any_errors_fatal 15896 1727203875.16755: done checking for any_errors_fatal 15896 1727203875.16756: checking for max_fail_percentage 15896 1727203875.16758: done checking for max_fail_percentage 15896 1727203875.16759: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.16759: done checking to see if all hosts have failed 15896 1727203875.16760: getting the remaining hosts for this loop 15896 1727203875.16762: done getting the remaining hosts for this loop 15896 1727203875.16765: getting the next task for host managed-node1 15896 1727203875.16771: done getting next task for host managed-node1 15896 1727203875.16774: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15896 1727203875.16778: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203875.16783: getting variables 15896 1727203875.16784: in VariableManager get_vars() 15896 1727203875.16841: Calling all_inventory to load vars for managed-node1 15896 1727203875.16843: Calling groups_inventory to load vars for managed-node1 15896 1727203875.16846: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.16858: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.16861: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.16864: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.17532: done sending task result for task 028d2410-947f-fb83-b6ad-000000000356 15896 1727203875.17535: WORKER PROCESS EXITING 15896 1727203875.18688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.21993: done with get_vars() 15896 1727203875.22202: done getting variables 15896 1727203875.22267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203875.22371: variable 'profile' from source: include params 15896 1727203875.22577: variable 'item' from source: include params 15896 1727203875.22637: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.087) 0:00:20.816 ***** 15896 1727203875.22672: entering _queue_task() for managed-node1/assert 15896 
1727203875.23470: worker is 1 (out of 1 available) 15896 1727203875.23556: exiting _queue_task() for managed-node1/assert 15896 1727203875.23570: done queuing things up, now waiting for results queue to drain 15896 1727203875.23572: waiting for pending results... 15896 1727203875.24194: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' 15896 1727203875.24199: in run() - task 028d2410-947f-fb83-b6ad-000000000357 15896 1727203875.24202: variable 'ansible_search_path' from source: unknown 15896 1727203875.24205: variable 'ansible_search_path' from source: unknown 15896 1727203875.24583: calling self._execute() 15896 1727203875.24587: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.24590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.24592: variable 'omit' from source: magic vars 15896 1727203875.25271: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.25292: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.25303: variable 'omit' from source: magic vars 15896 1727203875.25346: variable 'omit' from source: magic vars 15896 1727203875.25686: variable 'profile' from source: include params 15896 1727203875.25697: variable 'item' from source: include params 15896 1727203875.25764: variable 'item' from source: include params 15896 1727203875.25794: variable 'omit' from source: magic vars 15896 1727203875.25839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203875.26281: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203875.26284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203875.26286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15896 1727203875.26289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.26291: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203875.26293: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.26294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.26369: Set connection var ansible_shell_type to sh 15896 1727203875.26385: Set connection var ansible_connection to ssh 15896 1727203875.26396: Set connection var ansible_shell_executable to /bin/sh 15896 1727203875.26411: Set connection var ansible_pipelining to False 15896 1727203875.26457: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203875.26470: Set connection var ansible_timeout to 10 15896 1727203875.26499: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.26507: variable 'ansible_connection' from source: unknown 15896 1727203875.26515: variable 'ansible_module_compression' from source: unknown 15896 1727203875.26521: variable 'ansible_shell_type' from source: unknown 15896 1727203875.26526: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.26530: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.26535: variable 'ansible_pipelining' from source: unknown 15896 1727203875.26540: variable 'ansible_timeout' from source: unknown 15896 1727203875.26546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.26683: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 
1727203875.26698: variable 'omit' from source: magic vars 15896 1727203875.26707: starting attempt loop 15896 1727203875.26712: running the handler 15896 1727203875.26819: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15896 1727203875.26830: Evaluated conditional (lsr_net_profile_ansible_managed): True 15896 1727203875.26840: handler run complete 15896 1727203875.26859: attempt loop complete, returning result 15896 1727203875.26866: _execute() done 15896 1727203875.26873: dumping result to json 15896 1727203875.26883: done dumping result, returning 15896 1727203875.26895: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' [028d2410-947f-fb83-b6ad-000000000357] 15896 1727203875.26905: sending task result for task 028d2410-947f-fb83-b6ad-000000000357 15896 1727203875.27014: done sending task result for task 028d2410-947f-fb83-b6ad-000000000357 15896 1727203875.27021: WORKER PROCESS EXITING

ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed

15896 1727203875.27079: no more pending results, returning what we have 15896 1727203875.27082: results queue empty 15896 1727203875.27083: checking for any_errors_fatal 15896 1727203875.27088: done checking for any_errors_fatal 15896 1727203875.27089: checking for max_fail_percentage 15896 1727203875.27091: done checking for max_fail_percentage 15896 1727203875.27091: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.27092: done checking to see if all hosts have failed 15896 1727203875.27093: getting the remaining hosts for this loop 15896 1727203875.27095: done getting the remaining hosts for this loop 15896 1727203875.27098: getting the next task for host managed-node1 15896 1727203875.27104: done getting next task for host managed-node1 15896 1727203875.27106: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15896 1727203875.27109: ^ state is: HOST STATE:
block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203875.27114: getting variables 15896 1727203875.27115: in VariableManager get_vars() 15896 1727203875.27416: Calling all_inventory to load vars for managed-node1 15896 1727203875.27419: Calling groups_inventory to load vars for managed-node1 15896 1727203875.27421: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.27433: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.27436: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.27440: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.29192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.32819: done with get_vars() 15896 1727203875.32843: done getting variables 15896 1727203875.33019: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203875.33235: variable 'profile' from source: include params 15896 1727203875.33239: variable 'item' from source: include params 15896 1727203875.33438: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0] *****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.108) 0:00:20.924 *****

15896 1727203875.33479: entering _queue_task() for managed-node1/assert 15896 1727203875.34237: worker is 1 (out of 1 available) 15896 1727203875.34250: exiting _queue_task() for managed-node1/assert 15896 1727203875.34266: done queuing things up, now waiting for results queue to drain 15896 1727203875.34267: waiting for pending results... 15896 1727203875.34662: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 15896 1727203875.34777: in run() - task 028d2410-947f-fb83-b6ad-000000000358 15896 1727203875.35182: variable 'ansible_search_path' from source: unknown 15896 1727203875.35185: variable 'ansible_search_path' from source: unknown 15896 1727203875.35188: calling self._execute() 15896 1727203875.35500: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.35584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.35589: variable 'omit' from source: magic vars 15896 1727203875.36585: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.36636: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.36651: variable 'omit' from source: magic vars 15896 1727203875.36778: variable 'omit' from source: magic vars 15896 1727203875.37582: variable 'profile' from source: include params 15896 1727203875.37586: variable 'item' from source: include params 15896 1727203875.37589: variable 'item' from source: include params 15896 1727203875.37591: variable 'omit' from source: magic vars 15896 1727203875.37593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203875.37595: Loading Connection 'ssh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203875.37715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203875.37738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.37756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.37792: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203875.37822: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.37830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.38071: Set connection var ansible_shell_type to sh 15896 1727203875.38102: Set connection var ansible_connection to ssh 15896 1727203875.38112: Set connection var ansible_shell_executable to /bin/sh 15896 1727203875.38121: Set connection var ansible_pipelining to False 15896 1727203875.38129: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203875.38156: Set connection var ansible_timeout to 10 15896 1727203875.38212: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.38237: variable 'ansible_connection' from source: unknown 15896 1727203875.38258: variable 'ansible_module_compression' from source: unknown 15896 1727203875.38267: variable 'ansible_shell_type' from source: unknown 15896 1727203875.38274: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.38283: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.38290: variable 'ansible_pipelining' from source: unknown 15896 1727203875.38297: variable 'ansible_timeout' from source: unknown 15896 1727203875.38304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 
1727203875.38456: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203875.38479: variable 'omit' from source: magic vars 15896 1727203875.38490: starting attempt loop 15896 1727203875.38496: running the handler 15896 1727203875.38621: variable 'lsr_net_profile_fingerprint' from source: set_fact 15896 1727203875.38631: Evaluated conditional (lsr_net_profile_fingerprint): True 15896 1727203875.38641: handler run complete 15896 1727203875.38662: attempt loop complete, returning result 15896 1727203875.38669: _execute() done 15896 1727203875.38678: dumping result to json 15896 1727203875.38689: done dumping result, returning 15896 1727203875.38700: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 [028d2410-947f-fb83-b6ad-000000000358] 15896 1727203875.38708: sending task result for task 028d2410-947f-fb83-b6ad-000000000358 15896 1727203875.39081: done sending task result for task 028d2410-947f-fb83-b6ad-000000000358 15896 1727203875.39085: WORKER PROCESS EXITING

ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed

15896 1727203875.39126: no more pending results, returning what we have 15896 1727203875.39128: results queue empty 15896 1727203875.39129: checking for any_errors_fatal 15896 1727203875.39135: done checking for any_errors_fatal 15896 1727203875.39136: checking for max_fail_percentage 15896 1727203875.39138: done checking for max_fail_percentage 15896 1727203875.39138: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.39139: done checking to see if all hosts have failed 15896 1727203875.39140: getting the remaining hosts for this loop 15896 1727203875.39141: done getting the
remaining hosts for this loop 15896 1727203875.39144: getting the next task for host managed-node1 15896 1727203875.39152: done getting next task for host managed-node1 15896 1727203875.39154: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15896 1727203875.39157: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203875.39160: getting variables 15896 1727203875.39161: in VariableManager get_vars() 15896 1727203875.39209: Calling all_inventory to load vars for managed-node1 15896 1727203875.39211: Calling groups_inventory to load vars for managed-node1 15896 1727203875.39214: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.39223: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.39226: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.39229: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.40955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.42912: done with get_vars() 15896 1727203875.42941: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.095) 0:00:21.019 *****

15896 1727203875.43042: entering
_queue_task() for managed-node1/include_tasks 15896 1727203875.43402: worker is 1 (out of 1 available) 15896 1727203875.43414: exiting _queue_task() for managed-node1/include_tasks 15896 1727203875.43427: done queuing things up, now waiting for results queue to drain 15896 1727203875.43429: waiting for pending results... 15896 1727203875.43837: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 15896 1727203875.43963: in run() - task 028d2410-947f-fb83-b6ad-00000000035c 15896 1727203875.43988: variable 'ansible_search_path' from source: unknown 15896 1727203875.43996: variable 'ansible_search_path' from source: unknown 15896 1727203875.44033: calling self._execute() 15896 1727203875.44141: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.44157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.44181: variable 'omit' from source: magic vars 15896 1727203875.44552: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.44570: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.44586: _execute() done 15896 1727203875.44594: dumping result to json 15896 1727203875.44692: done dumping result, returning 15896 1727203875.44696: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-fb83-b6ad-00000000035c] 15896 1727203875.44698: sending task result for task 028d2410-947f-fb83-b6ad-00000000035c 15896 1727203875.44770: done sending task result for task 028d2410-947f-fb83-b6ad-00000000035c 15896 1727203875.44774: WORKER PROCESS EXITING 15896 1727203875.44821: no more pending results, returning what we have 15896 1727203875.44827: in VariableManager get_vars() 15896 1727203875.44893: Calling all_inventory to load vars for managed-node1 15896 1727203875.44896: Calling groups_inventory to load vars for managed-node1 15896 1727203875.44899: Calling 
all_plugins_inventory to load vars for managed-node1 15896 1727203875.44912: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.44916: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.44919: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.46722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.48792: done with get_vars() 15896 1727203875.48813: variable 'ansible_search_path' from source: unknown 15896 1727203875.48815: variable 'ansible_search_path' from source: unknown 15896 1727203875.48847: we have included files to process 15896 1727203875.48848: generating all_blocks data 15896 1727203875.48850: done generating all_blocks data 15896 1727203875.48854: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203875.48855: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203875.48857: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203875.50217: done processing included file 15896 1727203875.50219: iterating over new_blocks loaded from include file 15896 1727203875.50221: in VariableManager get_vars() 15896 1727203875.50252: done with get_vars() 15896 1727203875.50254: filtering new block on tags 15896 1727203875.50291: done filtering new block on tags 15896 1727203875.50294: in VariableManager get_vars() 15896 1727203875.50321: done with get_vars() 15896 1727203875.50323: filtering new block on tags 15896 1727203875.50344: done filtering new block on tags 15896 1727203875.50346: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 15896 1727203875.50351: extending task lists for all hosts with included blocks 15896 1727203875.50545: done extending task lists 15896 1727203875.50546: done processing included files 15896 1727203875.50547: results queue empty 15896 1727203875.50548: checking for any_errors_fatal 15896 1727203875.50551: done checking for any_errors_fatal 15896 1727203875.50552: checking for max_fail_percentage 15896 1727203875.50553: done checking for max_fail_percentage 15896 1727203875.50553: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.50554: done checking to see if all hosts have failed 15896 1727203875.50555: getting the remaining hosts for this loop 15896 1727203875.50556: done getting the remaining hosts for this loop 15896 1727203875.50561: getting the next task for host managed-node1 15896 1727203875.50565: done getting next task for host managed-node1 15896 1727203875.50567: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203875.50570: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15896 1727203875.50573: getting variables 15896 1727203875.50574: in VariableManager get_vars() 15896 1727203875.50600: Calling all_inventory to load vars for managed-node1 15896 1727203875.50603: Calling groups_inventory to load vars for managed-node1 15896 1727203875.50608: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.50614: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.50616: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.50619: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.52195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.54044: done with get_vars() 15896 1727203875.54077: done getting variables 15896 1727203875.54236: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.112) 0:00:21.132 *****

15896 1727203875.54271: entering _queue_task() for managed-node1/set_fact 15896 1727203875.55092: worker is 1 (out of 1 available) 15896 1727203875.55104: exiting _queue_task() for managed-node1/set_fact 15896 1727203875.55117: done queuing things up, now waiting for results queue to drain 15896 1727203875.55119: waiting for pending results...
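The set_fact task queued here (get_profile_stat.yml:3) initializes the three flags that the preceding assertions check. Its shape can be inferred from the `ansible_facts` it returns a few entries later in this log; the exact YAML below is a sketch, not a copy of the source file:

```yaml
# Hypothetical sketch of get_profile_stat.yml:3, inferred from the
# ansible_facts the task reports in this log (all three flags start false
# and are flipped to true by the subsequent stat/grep tasks).
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```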
15896 1727203875.55398: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203875.55514: in run() - task 028d2410-947f-fb83-b6ad-00000000062c 15896 1727203875.55533: variable 'ansible_search_path' from source: unknown 15896 1727203875.55536: variable 'ansible_search_path' from source: unknown 15896 1727203875.55571: calling self._execute() 15896 1727203875.55684: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.55688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.55700: variable 'omit' from source: magic vars 15896 1727203875.56102: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.56114: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.56183: variable 'omit' from source: magic vars 15896 1727203875.56187: variable 'omit' from source: magic vars 15896 1727203875.56213: variable 'omit' from source: magic vars 15896 1727203875.56258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203875.56305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203875.56324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203875.56341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.56353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.56479: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203875.56482: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.56485: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203875.56780: Set connection var ansible_shell_type to sh 15896 1727203875.56784: Set connection var ansible_connection to ssh 15896 1727203875.56786: Set connection var ansible_shell_executable to /bin/sh 15896 1727203875.56789: Set connection var ansible_pipelining to False 15896 1727203875.56791: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203875.56793: Set connection var ansible_timeout to 10 15896 1727203875.56796: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.56799: variable 'ansible_connection' from source: unknown 15896 1727203875.56801: variable 'ansible_module_compression' from source: unknown 15896 1727203875.56803: variable 'ansible_shell_type' from source: unknown 15896 1727203875.56806: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.56809: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.56811: variable 'ansible_pipelining' from source: unknown 15896 1727203875.56814: variable 'ansible_timeout' from source: unknown 15896 1727203875.56816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.56819: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203875.56822: variable 'omit' from source: magic vars 15896 1727203875.56825: starting attempt loop 15896 1727203875.56828: running the handler 15896 1727203875.56830: handler run complete 15896 1727203875.56833: attempt loop complete, returning result 15896 1727203875.56835: _execute() done 15896 1727203875.56837: dumping result to json 15896 1727203875.56840: done dumping result, returning 15896 1727203875.56842: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-fb83-b6ad-00000000062c] 15896 1727203875.56845: sending task result for task 028d2410-947f-fb83-b6ad-00000000062c 15896 1727203875.56930: done sending task result for task 028d2410-947f-fb83-b6ad-00000000062c 15896 1727203875.56932: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15896 1727203875.57079: no more pending results, returning what we have 15896 1727203875.57085: results queue empty 15896 1727203875.57085: checking for any_errors_fatal 15896 1727203875.57088: done checking for any_errors_fatal 15896 1727203875.57088: checking for max_fail_percentage 15896 1727203875.57090: done checking for max_fail_percentage 15896 1727203875.57091: checking to see if all hosts have failed and the running result is not ok 15896 1727203875.57091: done checking to see if all hosts have failed 15896 1727203875.57092: getting the remaining hosts for this loop 15896 1727203875.57094: done getting the remaining hosts for this loop 15896 1727203875.57097: getting the next task for host managed-node1 15896 1727203875.57104: done getting next task for host managed-node1 15896 1727203875.57106: ^ task is: TASK: Stat profile file 15896 1727203875.57111: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203875.57115: getting variables 15896 1727203875.57116: in VariableManager get_vars() 15896 1727203875.57243: Calling all_inventory to load vars for managed-node1 15896 1727203875.57246: Calling groups_inventory to load vars for managed-node1 15896 1727203875.57249: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203875.57262: Calling all_plugins_play to load vars for managed-node1 15896 1727203875.57265: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203875.57268: Calling groups_plugins_play to load vars for managed-node1 15896 1727203875.63278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203875.64815: done with get_vars() 15896 1727203875.64838: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:51:15 -0400 (0:00:00.106) 0:00:21.238 ***** 15896 1727203875.64927: entering _queue_task() for managed-node1/stat 15896 1727203875.65495: worker is 1 (out of 1 available) 15896 1727203875.65504: exiting _queue_task() for managed-node1/stat 15896 1727203875.65513: done queuing things up, now waiting for results queue to drain 15896 1727203875.65515: waiting for pending results... 
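For context, the set_fact result recorded above (`ok: [managed-node1]` with the three `lsr_net_profile_*` facts set to `false`) would correspond to a task of roughly this shape at `get_profile_stat.yml:3`. This is a sketch inferred from the `ansible_facts` payload and task name in the log, not the verbatim task file from the `fedora.linux_system_roles` collection:

```yaml
# Hypothetical reconstruction of the "Initialize NM profile exist and
# ansible_managed comment flag" task, inferred from the ansible_facts
# returned in the task result above; the real task file may differ.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

The log's `Evaluated conditional (ansible_distribution_major_version != '6'): True` entries indicate the surrounding play also gates these tasks on the distribution major version.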
15896 1727203875.65645: running TaskExecutor() for managed-node1/TASK: Stat profile file 15896 1727203875.65784: in run() - task 028d2410-947f-fb83-b6ad-00000000062d 15896 1727203875.65789: variable 'ansible_search_path' from source: unknown 15896 1727203875.65792: variable 'ansible_search_path' from source: unknown 15896 1727203875.65803: calling self._execute() 15896 1727203875.65981: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.65986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.65990: variable 'omit' from source: magic vars 15896 1727203875.66481: variable 'ansible_distribution_major_version' from source: facts 15896 1727203875.66485: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203875.66489: variable 'omit' from source: magic vars 15896 1727203875.66492: variable 'omit' from source: magic vars 15896 1727203875.66494: variable 'profile' from source: include params 15896 1727203875.66497: variable 'item' from source: include params 15896 1727203875.66681: variable 'item' from source: include params 15896 1727203875.66685: variable 'omit' from source: magic vars 15896 1727203875.66688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203875.66691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203875.66697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203875.66721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.66733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203875.66759: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203875.66767: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.66770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.67181: Set connection var ansible_shell_type to sh 15896 1727203875.67184: Set connection var ansible_connection to ssh 15896 1727203875.67186: Set connection var ansible_shell_executable to /bin/sh 15896 1727203875.67188: Set connection var ansible_pipelining to False 15896 1727203875.67191: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203875.67193: Set connection var ansible_timeout to 10 15896 1727203875.67195: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.67197: variable 'ansible_connection' from source: unknown 15896 1727203875.67199: variable 'ansible_module_compression' from source: unknown 15896 1727203875.67201: variable 'ansible_shell_type' from source: unknown 15896 1727203875.67203: variable 'ansible_shell_executable' from source: unknown 15896 1727203875.67206: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203875.67208: variable 'ansible_pipelining' from source: unknown 15896 1727203875.67211: variable 'ansible_timeout' from source: unknown 15896 1727203875.67213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203875.67216: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203875.67219: variable 'omit' from source: magic vars 15896 1727203875.67221: starting attempt loop 15896 1727203875.67223: running the handler 15896 1727203875.67225: _low_level_execute_command(): starting 15896 1727203875.67227: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203875.67927: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203875.67995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203875.68051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203875.68066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203875.68077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203875.68215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203875.70008: stdout chunk (state=3): >>>/root <<< 15896 1727203875.70168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203875.70172: stdout chunk (state=3): >>><<< 15896 1727203875.70174: stderr chunk (state=3): >>><<< 15896 1727203875.70303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203875.70307: _low_level_execute_command(): starting 15896 1727203875.70325: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269 `" && echo ansible-tmp-1727203875.702072-17850-6516937755269="` echo /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269 `" ) && sleep 0' 15896 1727203875.70908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203875.70918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203875.70931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203875.70980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203875.70991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203875.71005: stderr chunk (state=3): 
>>>debug2: match not found <<< 15896 1727203875.71008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203875.71011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203875.71102: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203875.71112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203875.71209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203875.73291: stdout chunk (state=3): >>>ansible-tmp-1727203875.702072-17850-6516937755269=/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269 <<< 15896 1727203875.73462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203875.73466: stdout chunk (state=3): >>><<< 15896 1727203875.73468: stderr chunk (state=3): >>><<< 15896 1727203875.73681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203875.702072-17850-6516937755269=/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203875.73685: variable 'ansible_module_compression' from source: unknown 15896 1727203875.73688: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203875.73690: variable 'ansible_facts' from source: unknown 15896 1727203875.73797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py 15896 1727203875.73998: Sending initial data 15896 1727203875.74007: Sent initial data (150 bytes) 15896 1727203875.74620: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203875.74737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203875.74741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203875.74743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203875.75067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203875.75309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203875.77067: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203875.77161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203875.77256: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp1m2r5irg /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py <<< 15896 1727203875.77274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py" <<< 15896 1727203875.77365: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15896 1727203875.77386: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp1m2r5irg" to remote "/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py" <<< 15896 1727203875.78665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203875.78668: stderr chunk (state=3): >>><<< 15896 1727203875.78671: stdout chunk (state=3): >>><<< 15896 1727203875.78698: done transferring module to remote 15896 1727203875.78707: _low_level_execute_command(): starting 15896 1727203875.78712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/ /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py && sleep 0' 15896 1727203875.79899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203875.79905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203875.80020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203875.80024: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203875.80030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203875.80032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203875.80083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203875.80089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203875.80162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203875.80248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203875.82384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203875.82388: stdout chunk (state=3): >>><<< 15896 1727203875.82390: stderr chunk (state=3): >>><<< 15896 1727203875.82393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203875.82395: _low_level_execute_command(): starting 15896 1727203875.82397: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/AnsiballZ_stat.py && sleep 0' 15896 1727203875.83665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203875.83740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203875.83743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203875.83763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203875.83788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203875.83806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203875.83835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203875.84009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203875.84022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203875.84149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.00724: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203876.02455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203876.02459: stdout chunk (state=3): >>><<< 15896 1727203876.02461: stderr chunk (state=3): >>><<< 15896 1727203876.02480: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203876.02796: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203876.02800: _low_level_execute_command(): starting 15896 1727203876.02803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203875.702072-17850-6516937755269/ > /dev/null 2>&1 && sleep 0' 15896 1727203876.03991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203876.03996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203876.03999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.04127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.04192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203876.04196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.04343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.04400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.06420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.06585: stderr chunk (state=3): >>><<< 15896 1727203876.06588: stdout chunk (state=3): >>><<< 15896 1727203876.06591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203876.06594: handler run complete 15896 1727203876.06596: attempt loop complete, returning result 15896 1727203876.06598: _execute() done 15896 1727203876.06602: dumping result to json 15896 1727203876.06883: done dumping result, returning 15896 1727203876.06886: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-fb83-b6ad-00000000062d] 15896 1727203876.06889: sending task result for task 028d2410-947f-fb83-b6ad-00000000062d 15896 1727203876.06960: done sending task result for task 028d2410-947f-fb83-b6ad-00000000062d 15896 1727203876.06965: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 15896 1727203876.07027: no more pending results, returning what we have 15896 1727203876.07030: results queue empty 15896 1727203876.07031: checking for any_errors_fatal 15896 1727203876.07040: done checking for any_errors_fatal 15896 1727203876.07041: checking for max_fail_percentage 15896 1727203876.07043: done checking for max_fail_percentage 15896 1727203876.07043: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.07044: done checking to see if all hosts have failed 15896 1727203876.07044: getting the 
remaining hosts for this loop 15896 1727203876.07046: done getting the remaining hosts for this loop 15896 1727203876.07049: getting the next task for host managed-node1 15896 1727203876.07056: done getting next task for host managed-node1 15896 1727203876.07060: ^ task is: TASK: Set NM profile exist flag based on the profile files 15896 1727203876.07064: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.07068: getting variables 15896 1727203876.07069: in VariableManager get_vars() 15896 1727203876.07232: Calling all_inventory to load vars for managed-node1 15896 1727203876.07235: Calling groups_inventory to load vars for managed-node1 15896 1727203876.07237: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.07247: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.07250: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.07252: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.10111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.12261: done with get_vars() 15896 1727203876.12284: done getting variables 15896 1727203876.12566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.477) 0:00:21.715 ***** 15896 1727203876.12664: entering _queue_task() for managed-node1/set_fact 15896 1727203876.13431: worker is 1 (out of 1 available) 15896 1727203876.13444: exiting _queue_task() for managed-node1/set_fact 15896 1727203876.13461: done queuing things up, now waiting for results queue to drain 15896 1727203876.13463: waiting for pending results... 
15896 1727203876.14497: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 15896 1727203876.14602: in run() - task 028d2410-947f-fb83-b6ad-00000000062e 15896 1727203876.14700: variable 'ansible_search_path' from source: unknown 15896 1727203876.14709: variable 'ansible_search_path' from source: unknown 15896 1727203876.14750: calling self._execute() 15896 1727203876.14982: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.14996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.15010: variable 'omit' from source: magic vars 15896 1727203876.15736: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.15896: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.16439: variable 'profile_stat' from source: set_fact 15896 1727203876.16443: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203876.16445: when evaluation is False, skipping this task 15896 1727203876.16448: _execute() done 15896 1727203876.16450: dumping result to json 15896 1727203876.16452: done dumping result, returning 15896 1727203876.16455: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-fb83-b6ad-00000000062e] 15896 1727203876.16457: sending task result for task 028d2410-947f-fb83-b6ad-00000000062e skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203876.16595: no more pending results, returning what we have 15896 1727203876.16599: results queue empty 15896 1727203876.16599: checking for any_errors_fatal 15896 1727203876.16609: done checking for any_errors_fatal 15896 1727203876.16610: checking for max_fail_percentage 15896 1727203876.16612: done checking for max_fail_percentage 15896 1727203876.16613: checking to see if all 
hosts have failed and the running result is not ok 15896 1727203876.16613: done checking to see if all hosts have failed 15896 1727203876.16614: getting the remaining hosts for this loop 15896 1727203876.16616: done getting the remaining hosts for this loop 15896 1727203876.16621: getting the next task for host managed-node1 15896 1727203876.16628: done getting next task for host managed-node1 15896 1727203876.16631: ^ task is: TASK: Get NM profile info 15896 1727203876.16635: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.16640: getting variables 15896 1727203876.16642: in VariableManager get_vars() 15896 1727203876.16695: Calling all_inventory to load vars for managed-node1 15896 1727203876.16697: Calling groups_inventory to load vars for managed-node1 15896 1727203876.16700: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.16715: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.16718: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.16721: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.17503: done sending task result for task 028d2410-947f-fb83-b6ad-00000000062e 15896 1727203876.17506: WORKER PROCESS EXITING 15896 1727203876.19771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.23418: done with get_vars() 15896 1727203876.23447: done getting variables 15896 1727203876.23511: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.108) 0:00:21.824 ***** 15896 1727203876.23545: entering _queue_task() for managed-node1/shell 15896 1727203876.23916: worker is 1 (out of 1 available) 15896 1727203876.23929: exiting _queue_task() for managed-node1/shell 15896 1727203876.23942: done queuing things up, now waiting for results queue to drain 15896 1727203876.23944: waiting for pending results... 
15896 1727203876.24244: running TaskExecutor() for managed-node1/TASK: Get NM profile info 15896 1727203876.24340: in run() - task 028d2410-947f-fb83-b6ad-00000000062f 15896 1727203876.24353: variable 'ansible_search_path' from source: unknown 15896 1727203876.24363: variable 'ansible_search_path' from source: unknown 15896 1727203876.24398: calling self._execute() 15896 1727203876.24502: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.24506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.24516: variable 'omit' from source: magic vars 15896 1727203876.24886: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.24901: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.24980: variable 'omit' from source: magic vars 15896 1727203876.24984: variable 'omit' from source: magic vars 15896 1727203876.25053: variable 'profile' from source: include params 15896 1727203876.25058: variable 'item' from source: include params 15896 1727203876.25125: variable 'item' from source: include params 15896 1727203876.25143: variable 'omit' from source: magic vars 15896 1727203876.25185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203876.25218: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203876.25244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203876.25264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.25273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.25304: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203876.25307: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.25310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.25412: Set connection var ansible_shell_type to sh 15896 1727203876.25419: Set connection var ansible_connection to ssh 15896 1727203876.25425: Set connection var ansible_shell_executable to /bin/sh 15896 1727203876.25435: Set connection var ansible_pipelining to False 15896 1727203876.25438: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203876.25446: Set connection var ansible_timeout to 10 15896 1727203876.25468: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.25472: variable 'ansible_connection' from source: unknown 15896 1727203876.25474: variable 'ansible_module_compression' from source: unknown 15896 1727203876.25478: variable 'ansible_shell_type' from source: unknown 15896 1727203876.25480: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.25483: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.25487: variable 'ansible_pipelining' from source: unknown 15896 1727203876.25490: variable 'ansible_timeout' from source: unknown 15896 1727203876.25494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.25655: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203876.25661: variable 'omit' from source: magic vars 15896 1727203876.25664: starting attempt loop 15896 1727203876.25667: running the handler 15896 1727203876.25670: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203876.25766: _low_level_execute_command(): starting 15896 1727203876.25769: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203876.26539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.26562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203876.26566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.26589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.26753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.28526: stdout chunk (state=3): >>>/root <<< 15896 1727203876.28699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.28702: stdout chunk (state=3): >>><<< 15896 1727203876.28704: stderr chunk (state=3): 
>>><<< 15896 1727203876.28725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203876.28744: _low_level_execute_command(): starting 15896 1727203876.28756: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650 `" && echo ansible-tmp-1727203876.2873156-17880-113605797287650="` echo /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650 `" ) && sleep 0' 15896 1727203876.29410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203876.29414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203876.29425: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15896 1727203876.29520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.29524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203876.29526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.29560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.29653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.31801: stdout chunk (state=3): >>>ansible-tmp-1727203876.2873156-17880-113605797287650=/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650 <<< 15896 1727203876.31974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.31980: stdout chunk (state=3): >>><<< 15896 1727203876.31982: stderr chunk (state=3): >>><<< 15896 1727203876.32000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203876.2873156-17880-113605797287650=/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203876.32180: variable 'ansible_module_compression' from source: unknown 15896 1727203876.32184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203876.32186: variable 'ansible_facts' from source: unknown 15896 1727203876.32229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py 15896 1727203876.32432: Sending initial data 15896 1727203876.32435: Sent initial data (156 bytes) 15896 1727203876.33026: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203876.33078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203876.33092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203876.33102: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203876.33113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.33184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.33208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.33315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.35048: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203876.35122: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203876.35224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpx2gusca4 /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py <<< 15896 1727203876.35227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py" <<< 15896 1727203876.35304: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpx2gusca4" to remote "/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py" <<< 15896 1727203876.36371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.36381: stdout chunk (state=3): >>><<< 15896 1727203876.36385: stderr chunk (state=3): >>><<< 15896 1727203876.36387: done transferring module to remote 15896 1727203876.36389: _low_level_execute_command(): starting 15896 1727203876.36392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/ /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py && sleep 0' 15896 1727203876.36975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203876.36991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203876.37028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203876.37134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203876.37163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.37179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.37293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.39320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.39338: stderr chunk (state=3): >>><<< 15896 1727203876.39346: stdout chunk (state=3): >>><<< 15896 1727203876.39369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203876.39378: _low_level_execute_command(): starting 15896 1727203876.39389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/AnsiballZ_command.py && sleep 0' 15896 1727203876.40013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203876.40035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203876.40050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203876.40069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203876.40089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203876.40143: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203876.40202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203876.40222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.40250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.40383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.59395: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:51:16.568132", "end": "2024-09-24 14:51:16.590361", "delta": "0:00:00.022229", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203876.61043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203876.61099: stderr chunk (state=3): >>><<< 15896 1727203876.61102: stdout chunk (state=3): >>><<< 15896 1727203876.61124: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:51:16.568132", "end": "2024-09-24 14:51:16.590361", "delta": "0:00:00.022229", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 15896 1727203876.61167: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203876.61178: _low_level_execute_command(): starting 15896 1727203876.61181: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203876.2873156-17880-113605797287650/ > /dev/null 2>&1 && sleep 0' 15896 1727203876.61842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203876.61845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203876.61847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203876.61849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203876.62080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203876.62109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203876.62122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203876.62228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203876.64192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203876.64247: stderr chunk (state=3): >>><<< 15896 1727203876.64480: stdout chunk (state=3): >>><<< 15896 1727203876.64483: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203876.64485: handler run complete 15896 1727203876.64487: Evaluated conditional (False): False 15896 1727203876.64489: attempt loop complete, returning result 15896 1727203876.64490: _execute() done 15896 1727203876.64492: dumping result to json 15896 1727203876.64493: done dumping result, returning 15896 1727203876.64495: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-fb83-b6ad-00000000062f] 15896 1727203876.64496: sending task result for task 028d2410-947f-fb83-b6ad-00000000062f 15896 1727203876.64558: done sending task result for task 028d2410-947f-fb83-b6ad-00000000062f 15896 1727203876.64563: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022229", "end": "2024-09-24 14:51:16.590361", "rc": 0, "start": "2024-09-24 14:51:16.568132" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 15896 1727203876.64652: no more pending results, returning what we have 15896 1727203876.64657: results queue empty 15896 1727203876.64658: checking for any_errors_fatal 15896 1727203876.64664: done checking for any_errors_fatal 15896 1727203876.64665: checking for max_fail_percentage 15896 1727203876.64667: done checking for max_fail_percentage 15896 1727203876.64667: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.64668: done checking to see if all hosts have failed 15896 1727203876.64669: getting the remaining hosts for this loop 15896 1727203876.64671: done getting the remaining hosts for this loop 15896 1727203876.64674: getting the next task for host managed-node1 15896 1727203876.64788: done getting next task for host managed-node1 15896 1727203876.64791: ^ task is: TASK: Set NM 
profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203876.64795: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203876.64799: getting variables 15896 1727203876.64801: in VariableManager get_vars() 15896 1727203876.64848: Calling all_inventory to load vars for managed-node1 15896 1727203876.64851: Calling groups_inventory to load vars for managed-node1 15896 1727203876.64853: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.64862: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.64865: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.64868: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.66539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.68170: done with get_vars() 15896 1727203876.68193: done getting variables 15896 1727203876.68251: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.447) 0:00:22.272 ***** 15896 1727203876.68296: entering _queue_task() for managed-node1/set_fact 15896 1727203876.68646: worker is 1 (out of 1 available) 15896 1727203876.68661: exiting _queue_task() for managed-node1/set_fact 15896 1727203876.68674: done queuing things up, now waiting for results queue to drain 15896 1727203876.68677: waiting for pending results... 15896 1727203876.68967: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203876.69081: in run() - task 028d2410-947f-fb83-b6ad-000000000630 15896 1727203876.69095: variable 'ansible_search_path' from source: unknown 15896 1727203876.69108: variable 'ansible_search_path' from source: unknown 15896 1727203876.69144: calling self._execute() 15896 1727203876.69253: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.69258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.69268: variable 'omit' from source: magic vars 15896 1727203876.69667: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.69679: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.69812: variable 'nm_profile_exists' from source: set_fact 15896 1727203876.69827: Evaluated conditional (nm_profile_exists.rc == 0): True 15896 1727203876.69834: variable 'omit' from source: magic vars 15896 1727203876.69895: variable 'omit' from source: magic vars 15896 1727203876.69925: 
variable 'omit' from source: magic vars 15896 1727203876.69967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203876.70181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203876.70184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203876.70187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.70189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.70195: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203876.70197: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.70200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.70202: Set connection var ansible_shell_type to sh 15896 1727203876.70204: Set connection var ansible_connection to ssh 15896 1727203876.70208: Set connection var ansible_shell_executable to /bin/sh 15896 1727203876.70214: Set connection var ansible_pipelining to False 15896 1727203876.70219: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203876.70224: Set connection var ansible_timeout to 10 15896 1727203876.70246: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.70249: variable 'ansible_connection' from source: unknown 15896 1727203876.70251: variable 'ansible_module_compression' from source: unknown 15896 1727203876.70255: variable 'ansible_shell_type' from source: unknown 15896 1727203876.70257: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.70262: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.70264: variable 'ansible_pipelining' from 
source: unknown 15896 1727203876.70268: variable 'ansible_timeout' from source: unknown 15896 1727203876.70270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.70423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203876.70432: variable 'omit' from source: magic vars 15896 1727203876.70438: starting attempt loop 15896 1727203876.70441: running the handler 15896 1727203876.70453: handler run complete 15896 1727203876.70465: attempt loop complete, returning result 15896 1727203876.70468: _execute() done 15896 1727203876.70470: dumping result to json 15896 1727203876.70472: done dumping result, returning 15896 1727203876.70480: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-fb83-b6ad-000000000630] 15896 1727203876.70485: sending task result for task 028d2410-947f-fb83-b6ad-000000000630 15896 1727203876.70573: done sending task result for task 028d2410-947f-fb83-b6ad-000000000630 15896 1727203876.70782: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15896 1727203876.70827: no more pending results, returning what we have 15896 1727203876.70830: results queue empty 15896 1727203876.70831: checking for any_errors_fatal 15896 1727203876.70836: done checking for any_errors_fatal 15896 1727203876.70839: checking for max_fail_percentage 15896 1727203876.70841: done checking for max_fail_percentage 15896 1727203876.70841: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.70842: 
done checking to see if all hosts have failed 15896 1727203876.70843: getting the remaining hosts for this loop 15896 1727203876.70844: done getting the remaining hosts for this loop 15896 1727203876.70847: getting the next task for host managed-node1 15896 1727203876.70854: done getting next task for host managed-node1 15896 1727203876.70856: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203876.70860: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.70864: getting variables 15896 1727203876.70865: in VariableManager get_vars() 15896 1727203876.70909: Calling all_inventory to load vars for managed-node1 15896 1727203876.70912: Calling groups_inventory to load vars for managed-node1 15896 1727203876.70915: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.70923: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.70927: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.70930: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.72342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.74017: done with get_vars() 15896 1727203876.74045: done getting variables 15896 1727203876.74114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203876.74239: variable 'profile' from source: include params 15896 1727203876.74244: variable 'item' from source: include params 15896 1727203876.74313: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.060) 0:00:22.332 ***** 15896 1727203876.74350: entering _queue_task() for managed-node1/command 15896 1727203876.74750: worker is 1 (out of 1 available) 15896 1727203876.74763: exiting _queue_task() for managed-node1/command 15896 1727203876.74777: done queuing things up, now waiting for results queue to drain 15896 1727203876.74779: waiting for pending results... 
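For orientation: the `ok:` result above (the three `lsr_net_profile_*` facts set after `nm_profile_exists.rc == 0` evaluated True) corresponds to a `set_fact` task along these lines — a sketch reconstructed from the logged conditionals and facts, not the verbatim contents of get_profile_stat.yml:

```yaml
# Hedged reconstruction from the log, not the actual playbook source.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  # The log shows this conditional evaluating True because the nmcli|grep
  # pipeline exited 0 (the bond0.0 keyfile was found under /etc).
  when: nm_profile_exists.rc == 0
```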
15896 1727203876.75033: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 15896 1727203876.75151: in run() - task 028d2410-947f-fb83-b6ad-000000000632 15896 1727203876.75178: variable 'ansible_search_path' from source: unknown 15896 1727203876.75182: variable 'ansible_search_path' from source: unknown 15896 1727203876.75214: calling self._execute() 15896 1727203876.75325: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.75331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.75342: variable 'omit' from source: magic vars 15896 1727203876.75733: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.75745: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.75874: variable 'profile_stat' from source: set_fact 15896 1727203876.75889: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203876.75892: when evaluation is False, skipping this task 15896 1727203876.75895: _execute() done 15896 1727203876.75897: dumping result to json 15896 1727203876.75899: done dumping result, returning 15896 1727203876.75907: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [028d2410-947f-fb83-b6ad-000000000632] 15896 1727203876.75914: sending task result for task 028d2410-947f-fb83-b6ad-000000000632 15896 1727203876.76010: done sending task result for task 028d2410-947f-fb83-b6ad-000000000632 15896 1727203876.76012: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203876.76078: no more pending results, returning what we have 15896 1727203876.76082: results queue empty 15896 1727203876.76083: checking for any_errors_fatal 15896 1727203876.76090: done checking for any_errors_fatal 15896 1727203876.76090: 
checking for max_fail_percentage 15896 1727203876.76092: done checking for max_fail_percentage 15896 1727203876.76093: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.76094: done checking to see if all hosts have failed 15896 1727203876.76095: getting the remaining hosts for this loop 15896 1727203876.76097: done getting the remaining hosts for this loop 15896 1727203876.76100: getting the next task for host managed-node1 15896 1727203876.76108: done getting next task for host managed-node1 15896 1727203876.76111: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203876.76116: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.76122: getting variables 15896 1727203876.76123: in VariableManager get_vars() 15896 1727203876.76296: Calling all_inventory to load vars for managed-node1 15896 1727203876.76299: Calling groups_inventory to load vars for managed-node1 15896 1727203876.76302: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.76313: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.76316: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.76318: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.78001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.79689: done with get_vars() 15896 1727203876.79713: done getting variables 15896 1727203876.79783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203876.80054: variable 'profile' from source: include params 15896 1727203876.80176: variable 'item' from source: include params 15896 1727203876.80241: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.059) 0:00:22.392 ***** 15896 1727203876.80274: entering _queue_task() for managed-node1/set_fact 15896 1727203876.81087: worker is 1 (out of 1 available) 15896 1727203876.81099: exiting _queue_task() for managed-node1/set_fact 15896 1727203876.81118: done queuing things up, now waiting for results queue to drain 15896 1727203876.81120: waiting for pending results... 
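The skip above (and the three similar skips that follow) come from command/set_fact tasks gated on `profile_stat.stat.exists`: the bond0.0 profile is stored as a keyfile under /etc/NetworkManager/system-connections, so no ifcfg file exists and the stat is False. A hedged sketch of such a gated task — the grep pattern, ifcfg path, and register name are illustrative assumptions, not the verbatim file:

```yaml
# Illustrative shape only; path, pattern, and register name are assumed.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # hypothetical register name
  # False in this run, so the task reports:
  #   skipping: ... "skip_reason": "Conditional result was False"
  when: profile_stat.stat.exists
```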
15896 1727203876.81794: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 15896 1727203876.81800: in run() - task 028d2410-947f-fb83-b6ad-000000000633 15896 1727203876.81802: variable 'ansible_search_path' from source: unknown 15896 1727203876.81805: variable 'ansible_search_path' from source: unknown 15896 1727203876.81808: calling self._execute() 15896 1727203876.81913: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.81917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.81929: variable 'omit' from source: magic vars 15896 1727203876.82324: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.82336: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.82464: variable 'profile_stat' from source: set_fact 15896 1727203876.82474: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203876.82484: when evaluation is False, skipping this task 15896 1727203876.82487: _execute() done 15896 1727203876.82492: dumping result to json 15896 1727203876.82494: done dumping result, returning 15896 1727203876.82502: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [028d2410-947f-fb83-b6ad-000000000633] 15896 1727203876.82507: sending task result for task 028d2410-947f-fb83-b6ad-000000000633 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203876.82767: no more pending results, returning what we have 15896 1727203876.82771: results queue empty 15896 1727203876.82772: checking for any_errors_fatal 15896 1727203876.82780: done checking for any_errors_fatal 15896 1727203876.82781: checking for max_fail_percentage 15896 1727203876.82782: done checking for max_fail_percentage 15896 1727203876.82783: checking to see if all 
hosts have failed and the running result is not ok 15896 1727203876.82784: done checking to see if all hosts have failed 15896 1727203876.82785: getting the remaining hosts for this loop 15896 1727203876.82787: done getting the remaining hosts for this loop 15896 1727203876.82790: getting the next task for host managed-node1 15896 1727203876.82796: done getting next task for host managed-node1 15896 1727203876.82798: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15896 1727203876.82802: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.82807: getting variables 15896 1727203876.82808: in VariableManager get_vars() 15896 1727203876.82893: Calling all_inventory to load vars for managed-node1 15896 1727203876.82896: Calling groups_inventory to load vars for managed-node1 15896 1727203876.82898: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.82912: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.82915: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.82918: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.83537: done sending task result for task 028d2410-947f-fb83-b6ad-000000000633 15896 1727203876.83541: WORKER PROCESS EXITING 15896 1727203876.85004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.86672: done with get_vars() 15896 1727203876.86709: done getting variables 15896 1727203876.86771: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203876.86901: variable 'profile' from source: include params 15896 1727203876.86909: variable 'item' from source: include params 15896 1727203876.86981: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.067) 0:00:22.459 ***** 15896 1727203876.87011: entering _queue_task() for managed-node1/command 15896 1727203876.87404: worker is 1 (out of 1 available) 15896 1727203876.87417: exiting _queue_task() for managed-node1/command 15896 
1727203876.87429: done queuing things up, now waiting for results queue to drain 15896 1727203876.87431: waiting for pending results... 15896 1727203876.87639: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 15896 1727203876.87730: in run() - task 028d2410-947f-fb83-b6ad-000000000634 15896 1727203876.87747: variable 'ansible_search_path' from source: unknown 15896 1727203876.87751: variable 'ansible_search_path' from source: unknown 15896 1727203876.87783: calling self._execute() 15896 1727203876.87963: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.88067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.88071: variable 'omit' from source: magic vars 15896 1727203876.88620: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.88638: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.88794: variable 'profile_stat' from source: set_fact 15896 1727203876.88813: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203876.88842: when evaluation is False, skipping this task 15896 1727203876.88850: _execute() done 15896 1727203876.88857: dumping result to json 15896 1727203876.88867: done dumping result, returning 15896 1727203876.88882: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [028d2410-947f-fb83-b6ad-000000000634] 15896 1727203876.88956: sending task result for task 028d2410-947f-fb83-b6ad-000000000634 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203876.89135: no more pending results, returning what we have 15896 1727203876.89139: results queue empty 15896 1727203876.89140: checking for any_errors_fatal 15896 1727203876.89149: done checking for any_errors_fatal 15896 1727203876.89150: checking for 
max_fail_percentage 15896 1727203876.89152: done checking for max_fail_percentage 15896 1727203876.89153: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.89153: done checking to see if all hosts have failed 15896 1727203876.89154: getting the remaining hosts for this loop 15896 1727203876.89156: done getting the remaining hosts for this loop 15896 1727203876.89159: getting the next task for host managed-node1 15896 1727203876.89271: done getting next task for host managed-node1 15896 1727203876.89293: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15896 1727203876.89317: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.89460: getting variables 15896 1727203876.89463: in VariableManager get_vars() 15896 1727203876.89538: Calling all_inventory to load vars for managed-node1 15896 1727203876.89540: Calling groups_inventory to load vars for managed-node1 15896 1727203876.89543: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.89549: done sending task result for task 028d2410-947f-fb83-b6ad-000000000634 15896 1727203876.89551: WORKER PROCESS EXITING 15896 1727203876.89559: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.89561: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.89564: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.90419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.91473: done with get_vars() 15896 1727203876.91495: done getting variables 15896 1727203876.91632: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203876.91735: variable 'profile' from source: include params 15896 1727203876.91739: variable 'item' from source: include params 15896 1727203876.92036: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.051) 0:00:22.510 ***** 15896 1727203876.92116: entering _queue_task() for managed-node1/set_fact 15896 1727203876.92559: worker is 1 (out of 1 available) 15896 1727203876.92572: exiting _queue_task() for managed-node1/set_fact 15896 
1727203876.92591: done queuing things up, now waiting for results queue to drain 15896 1727203876.92593: waiting for pending results... 15896 1727203876.92923: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 15896 1727203876.93040: in run() - task 028d2410-947f-fb83-b6ad-000000000635 15896 1727203876.93052: variable 'ansible_search_path' from source: unknown 15896 1727203876.93057: variable 'ansible_search_path' from source: unknown 15896 1727203876.93090: calling self._execute() 15896 1727203876.93170: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.93174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.93184: variable 'omit' from source: magic vars 15896 1727203876.93450: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.93463: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.93547: variable 'profile_stat' from source: set_fact 15896 1727203876.93557: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203876.93560: when evaluation is False, skipping this task 15896 1727203876.93564: _execute() done 15896 1727203876.93569: dumping result to json 15896 1727203876.93572: done dumping result, returning 15896 1727203876.93581: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [028d2410-947f-fb83-b6ad-000000000635] 15896 1727203876.93587: sending task result for task 028d2410-947f-fb83-b6ad-000000000635 15896 1727203876.93666: done sending task result for task 028d2410-947f-fb83-b6ad-000000000635 15896 1727203876.93668: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203876.93719: no more pending results, returning what we have 15896 1727203876.93723: results queue empty 15896 
1727203876.93724: checking for any_errors_fatal 15896 1727203876.93731: done checking for any_errors_fatal 15896 1727203876.93732: checking for max_fail_percentage 15896 1727203876.93733: done checking for max_fail_percentage 15896 1727203876.93734: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.93735: done checking to see if all hosts have failed 15896 1727203876.93735: getting the remaining hosts for this loop 15896 1727203876.93737: done getting the remaining hosts for this loop 15896 1727203876.93740: getting the next task for host managed-node1 15896 1727203876.93748: done getting next task for host managed-node1 15896 1727203876.93751: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15896 1727203876.93754: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203876.93758: getting variables 15896 1727203876.93759: in VariableManager get_vars() 15896 1727203876.93814: Calling all_inventory to load vars for managed-node1 15896 1727203876.93817: Calling groups_inventory to load vars for managed-node1 15896 1727203876.93819: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.93830: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.93832: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.93834: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.94640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203876.96217: done with get_vars() 15896 1727203876.96238: done getting variables 15896 1727203876.96300: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203876.96411: variable 'profile' from source: include params 15896 1727203876.96414: variable 'item' from source: include params 15896 1727203876.96452: variable 'item' from source: include params

TASK [Assert that the profile is present - 'bond0.0'] **************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:51:16 -0400 (0:00:00.043) 0:00:22.554 *****

15896 1727203876.96477: entering _queue_task() for managed-node1/assert 15896 1727203876.96708: worker is 1 (out of 1 available) 15896 1727203876.96722: exiting _queue_task() for managed-node1/assert 15896 1727203876.96734: done queuing things up, now waiting for results queue to drain 15896 1727203876.96736: waiting for pending results...
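For orientation, the three assertions this trace walks through (assert_profile_present.yml lines 5, 10, and 15) plausibly look like the following. This is a minimal sketch reconstructed only from the task names, paths, and conditionals that appear in this log (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint), not the actual file contents:

```yaml
# Hypothetical reconstruction of tasks/assert_profile_present.yml,
# inferred from the logged task names and evaluated conditionals.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each assert evaluates its conditional against facts set earlier by get_profile_stat.yml, which matches the "Evaluated conditional (...): True" entries and the "All assertions passed" results seen below.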
15896 1727203876.96906: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' 15896 1727203876.96977: in run() - task 028d2410-947f-fb83-b6ad-00000000035d 15896 1727203876.96990: variable 'ansible_search_path' from source: unknown 15896 1727203876.96993: variable 'ansible_search_path' from source: unknown 15896 1727203876.97021: calling self._execute() 15896 1727203876.97102: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.97106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.97116: variable 'omit' from source: magic vars 15896 1727203876.97380: variable 'ansible_distribution_major_version' from source: facts 15896 1727203876.97389: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203876.97401: variable 'omit' from source: magic vars 15896 1727203876.97423: variable 'omit' from source: magic vars 15896 1727203876.97494: variable 'profile' from source: include params 15896 1727203876.97497: variable 'item' from source: include params 15896 1727203876.97545: variable 'item' from source: include params 15896 1727203876.97558: variable 'omit' from source: magic vars 15896 1727203876.97596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203876.97624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203876.97639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203876.97652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.97661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203876.97688: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203876.97692: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.97694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.97761: Set connection var ansible_shell_type to sh 15896 1727203876.97770: Set connection var ansible_connection to ssh 15896 1727203876.97777: Set connection var ansible_shell_executable to /bin/sh 15896 1727203876.97782: Set connection var ansible_pipelining to False 15896 1727203876.97787: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203876.97792: Set connection var ansible_timeout to 10 15896 1727203876.97809: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.97811: variable 'ansible_connection' from source: unknown 15896 1727203876.97814: variable 'ansible_module_compression' from source: unknown 15896 1727203876.97816: variable 'ansible_shell_type' from source: unknown 15896 1727203876.97818: variable 'ansible_shell_executable' from source: unknown 15896 1727203876.97820: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203876.97824: variable 'ansible_pipelining' from source: unknown 15896 1727203876.97829: variable 'ansible_timeout' from source: unknown 15896 1727203876.97831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203876.97931: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203876.97939: variable 'omit' from source: magic vars 15896 1727203876.97944: starting attempt loop 15896 1727203876.97947: running the handler 15896 1727203876.98024: variable 'lsr_net_profile_exists' from source: set_fact 15896 1727203876.98027: Evaluated conditional 
(lsr_net_profile_exists): True 15896 1727203876.98066: handler run complete 15896 1727203876.98070: attempt loop complete, returning result 15896 1727203876.98072: _execute() done 15896 1727203876.98083: dumping result to json 15896 1727203876.98090: done dumping result, returning 15896 1727203876.98092: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' [028d2410-947f-fb83-b6ad-00000000035d] 15896 1727203876.98095: sending task result for task 028d2410-947f-fb83-b6ad-00000000035d 15896 1727203876.98224: done sending task result for task 028d2410-947f-fb83-b6ad-00000000035d 15896 1727203876.98227: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
15896 1727203876.98353: no more pending results, returning what we have 15896 1727203876.98355: results queue empty 15896 1727203876.98356: checking for any_errors_fatal 15896 1727203876.98361: done checking for any_errors_fatal 15896 1727203876.98362: checking for max_fail_percentage 15896 1727203876.98363: done checking for max_fail_percentage 15896 1727203876.98364: checking to see if all hosts have failed and the running result is not ok 15896 1727203876.98365: done checking to see if all hosts have failed 15896 1727203876.98365: getting the remaining hosts for this loop 15896 1727203876.98366: done getting the remaining hosts for this loop 15896 1727203876.98369: getting the next task for host managed-node1 15896 1727203876.98374: done getting next task for host managed-node1 15896 1727203876.98377: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15896 1727203876.98380: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203876.98385: getting variables 15896 1727203876.98386: in VariableManager get_vars() 15896 1727203876.98433: Calling all_inventory to load vars for managed-node1 15896 1727203876.98436: Calling groups_inventory to load vars for managed-node1 15896 1727203876.98438: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203876.98447: Calling all_plugins_play to load vars for managed-node1 15896 1727203876.98450: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203876.98453: Calling groups_plugins_play to load vars for managed-node1 15896 1727203876.99548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.00404: done with get_vars() 15896 1727203877.00420: done getting variables 15896 1727203877.00464: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203877.00541: variable 'profile' from source: include params 15896 1727203877.00544: variable 'item' from source: include params 15896 1727203877.00587: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0.0'] *********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.041) 0:00:22.595 *****

15896 1727203877.00612: entering _queue_task() for managed-node1/assert 15896 1727203877.00833: worker is 1 (out of 1 available) 15896 1727203877.00845: exiting _queue_task() for managed-node1/assert 15896 1727203877.00857: done queuing things up, now waiting for results queue to drain 15896 1727203877.00859: waiting for pending results... 15896 1727203877.01031: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 15896 1727203877.01103: in run() - task 028d2410-947f-fb83-b6ad-00000000035e 15896 1727203877.01112: variable 'ansible_search_path' from source: unknown 15896 1727203877.01116: variable 'ansible_search_path' from source: unknown 15896 1727203877.01142: calling self._execute() 15896 1727203877.01221: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.01225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.01234: variable 'omit' from source: magic vars 15896 1727203877.01506: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.01516: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.01522: variable 'omit' from source: magic vars 15896 1727203877.01552: variable 'omit' from source: magic vars 15896 1727203877.01625: variable 'profile' from source: include params 15896 1727203877.01629: variable 'item' from source: include params 15896 1727203877.01678: variable 'item' from source: include params 15896 1727203877.01692: variable 'omit' from source: magic vars 15896 1727203877.01724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203877.01752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203877.01767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896
1727203877.01783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.01792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.01816: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203877.01819: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.01821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.01895: Set connection var ansible_shell_type to sh 15896 1727203877.01901: Set connection var ansible_connection to ssh 15896 1727203877.01906: Set connection var ansible_shell_executable to /bin/sh 15896 1727203877.01911: Set connection var ansible_pipelining to False 15896 1727203877.01916: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203877.01921: Set connection var ansible_timeout to 10 15896 1727203877.01937: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.01940: variable 'ansible_connection' from source: unknown 15896 1727203877.01942: variable 'ansible_module_compression' from source: unknown 15896 1727203877.01946: variable 'ansible_shell_type' from source: unknown 15896 1727203877.01948: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.01950: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.01952: variable 'ansible_pipelining' from source: unknown 15896 1727203877.01955: variable 'ansible_timeout' from source: unknown 15896 1727203877.01959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.02060: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203877.02072: variable 'omit' from source: magic vars 15896 1727203877.02082: starting attempt loop 15896 1727203877.02086: running the handler 15896 1727203877.02155: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15896 1727203877.02159: Evaluated conditional (lsr_net_profile_ansible_managed): True 15896 1727203877.02167: handler run complete 15896 1727203877.02182: attempt loop complete, returning result 15896 1727203877.02186: _execute() done 15896 1727203877.02189: dumping result to json 15896 1727203877.02191: done dumping result, returning 15896 1727203877.02200: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [028d2410-947f-fb83-b6ad-00000000035e] 15896 1727203877.02202: sending task result for task 028d2410-947f-fb83-b6ad-00000000035e 15896 1727203877.02279: done sending task result for task 028d2410-947f-fb83-b6ad-00000000035e 15896 1727203877.02282: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
15896 1727203877.02334: no more pending results, returning what we have 15896 1727203877.02337: results queue empty 15896 1727203877.02338: checking for any_errors_fatal 15896 1727203877.02346: done checking for any_errors_fatal 15896 1727203877.02346: checking for max_fail_percentage 15896 1727203877.02348: done checking for max_fail_percentage 15896 1727203877.02348: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.02349: done checking to see if all hosts have failed 15896 1727203877.02350: getting the remaining hosts for this loop 15896 1727203877.02351: done getting the remaining hosts for this loop 15896 1727203877.02354: getting the next task for host managed-node1 15896 1727203877.02359: done getting
next task for host managed-node1 15896 1727203877.02361: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15896 1727203877.02364: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.02368: getting variables 15896 1727203877.02369: in VariableManager get_vars() 15896 1727203877.02424: Calling all_inventory to load vars for managed-node1 15896 1727203877.02426: Calling groups_inventory to load vars for managed-node1 15896 1727203877.02429: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.02437: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.02440: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.02442: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.03329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.04170: done with get_vars() 15896 1727203877.04187: done getting variables 15896 1727203877.04225: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203877.04302: variable 'profile' from source: include params 15896 1727203877.04305: variable 'item' from 
source: include params 15896 1727203877.04344: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.037) 0:00:22.633 ***** 15896 1727203877.04372: entering _queue_task() for managed-node1/assert 15896 1727203877.04608: worker is 1 (out of 1 available) 15896 1727203877.04620: exiting _queue_task() for managed-node1/assert 15896 1727203877.04632: done queuing things up, now waiting for results queue to drain 15896 1727203877.04634: waiting for pending results... 15896 1727203877.04806: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 15896 1727203877.04878: in run() - task 028d2410-947f-fb83-b6ad-00000000035f 15896 1727203877.04892: variable 'ansible_search_path' from source: unknown 15896 1727203877.04896: variable 'ansible_search_path' from source: unknown 15896 1727203877.04922: calling self._execute() 15896 1727203877.05000: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.05004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.05014: variable 'omit' from source: magic vars 15896 1727203877.05276: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.05286: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.05292: variable 'omit' from source: magic vars 15896 1727203877.05324: variable 'omit' from source: magic vars 15896 1727203877.05394: variable 'profile' from source: include params 15896 1727203877.05397: variable 'item' from source: include params 15896 1727203877.05444: variable 'item' from source: include params 15896 1727203877.05460: variable 'omit' from source: magic vars 15896 1727203877.05495: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203877.05523: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203877.05539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203877.05551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.05560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.05588: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203877.05591: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.05594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.05660: Set connection var ansible_shell_type to sh 15896 1727203877.05669: Set connection var ansible_connection to ssh 15896 1727203877.05674: Set connection var ansible_shell_executable to /bin/sh 15896 1727203877.05681: Set connection var ansible_pipelining to False 15896 1727203877.05686: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203877.05691: Set connection var ansible_timeout to 10 15896 1727203877.05707: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.05710: variable 'ansible_connection' from source: unknown 15896 1727203877.05713: variable 'ansible_module_compression' from source: unknown 15896 1727203877.05715: variable 'ansible_shell_type' from source: unknown 15896 1727203877.05717: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.05719: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.05722: variable 'ansible_pipelining' from source: unknown 15896 1727203877.05725: variable 'ansible_timeout' 
from source: unknown 15896 1727203877.05731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.05829: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203877.05840: variable 'omit' from source: magic vars 15896 1727203877.05843: starting attempt loop 15896 1727203877.05846: running the handler 15896 1727203877.05922: variable 'lsr_net_profile_fingerprint' from source: set_fact 15896 1727203877.05925: Evaluated conditional (lsr_net_profile_fingerprint): True 15896 1727203877.05931: handler run complete 15896 1727203877.05943: attempt loop complete, returning result 15896 1727203877.05946: _execute() done 15896 1727203877.05949: dumping result to json 15896 1727203877.05951: done dumping result, returning 15896 1727203877.05963: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 [028d2410-947f-fb83-b6ad-00000000035f] 15896 1727203877.05966: sending task result for task 028d2410-947f-fb83-b6ad-00000000035f 15896 1727203877.06041: done sending task result for task 028d2410-947f-fb83-b6ad-00000000035f 15896 1727203877.06044: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
15896 1727203877.06119: no more pending results, returning what we have 15896 1727203877.06123: results queue empty 15896 1727203877.06124: checking for any_errors_fatal 15896 1727203877.06131: done checking for any_errors_fatal 15896 1727203877.06131: checking for max_fail_percentage 15896 1727203877.06133: done checking for max_fail_percentage 15896 1727203877.06134: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.06134: done checking to see if all
hosts have failed 15896 1727203877.06135: getting the remaining hosts for this loop 15896 1727203877.06136: done getting the remaining hosts for this loop 15896 1727203877.06139: getting the next task for host managed-node1 15896 1727203877.06147: done getting next task for host managed-node1 15896 1727203877.06150: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15896 1727203877.06152: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.06155: getting variables 15896 1727203877.06156: in VariableManager get_vars() 15896 1727203877.06207: Calling all_inventory to load vars for managed-node1 15896 1727203877.06209: Calling groups_inventory to load vars for managed-node1 15896 1727203877.06211: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.06219: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.06221: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.06224: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.06970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.07826: done with get_vars() 15896 1727203877.07840: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.035) 0:00:22.668 *****

15896 1727203877.07904: entering _queue_task() for managed-node1/include_tasks 15896 1727203877.08119: worker is 1 (out of 1 available) 15896 1727203877.08133: exiting _queue_task() for managed-node1/include_tasks 15896 1727203877.08145: done queuing things up, now waiting for results queue to drain 15896 1727203877.08147: waiting for pending results... 15896 1727203877.08312: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 15896 1727203877.08392: in run() - task 028d2410-947f-fb83-b6ad-000000000363 15896 1727203877.08404: variable 'ansible_search_path' from source: unknown 15896 1727203877.08407: variable 'ansible_search_path' from source: unknown 15896 1727203877.08433: calling self._execute() 15896 1727203877.08509: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.08513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.08522: variable 'omit' from source: magic vars 15896 1727203877.08786: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.08795: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.08803: _execute() done 15896 1727203877.08806: dumping result to json 15896 1727203877.08809: done dumping result, returning 15896 1727203877.08818: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-fb83-b6ad-000000000363] 15896 1727203877.08821: sending task result for task 028d2410-947f-fb83-b6ad-000000000363 15896 1727203877.08905: done sending task result for task 028d2410-947f-fb83-b6ad-000000000363 15896 1727203877.08908: WORKER PROCESS EXITING 15896 1727203877.08947: no more pending results, returning what we have 15896
1727203877.08952: in VariableManager get_vars() 15896 1727203877.09005: Calling all_inventory to load vars for managed-node1 15896 1727203877.09008: Calling groups_inventory to load vars for managed-node1 15896 1727203877.09009: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.09019: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.09021: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.09024: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.09884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.10725: done with get_vars() 15896 1727203877.10738: variable 'ansible_search_path' from source: unknown 15896 1727203877.10739: variable 'ansible_search_path' from source: unknown 15896 1727203877.10765: we have included files to process 15896 1727203877.10765: generating all_blocks data 15896 1727203877.10767: done generating all_blocks data 15896 1727203877.10769: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203877.10770: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203877.10771: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15896 1727203877.11343: done processing included file 15896 1727203877.11345: iterating over new_blocks loaded from include file 15896 1727203877.11346: in VariableManager get_vars() 15896 1727203877.11366: done with get_vars() 15896 1727203877.11368: filtering new block on tags 15896 1727203877.11384: done filtering new block on tags 15896 1727203877.11386: in VariableManager get_vars() 15896 1727203877.11401: done with get_vars() 15896 1727203877.11402: filtering 
new block on tags 15896 1727203877.11414: done filtering new block on tags 15896 1727203877.11415: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 15896 1727203877.11419: extending task lists for all hosts with included blocks 15896 1727203877.11521: done extending task lists 15896 1727203877.11522: done processing included files 15896 1727203877.11522: results queue empty 15896 1727203877.11523: checking for any_errors_fatal 15896 1727203877.11524: done checking for any_errors_fatal 15896 1727203877.11525: checking for max_fail_percentage 15896 1727203877.11525: done checking for max_fail_percentage 15896 1727203877.11526: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.11526: done checking to see if all hosts have failed 15896 1727203877.11527: getting the remaining hosts for this loop 15896 1727203877.11528: done getting the remaining hosts for this loop 15896 1727203877.11529: getting the next task for host managed-node1 15896 1727203877.11532: done getting next task for host managed-node1 15896 1727203877.11533: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203877.11535: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.11537: getting variables 15896 1727203877.11537: in VariableManager get_vars() 15896 1727203877.11548: Calling all_inventory to load vars for managed-node1 15896 1727203877.11550: Calling groups_inventory to load vars for managed-node1 15896 1727203877.11551: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.11554: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.11556: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.11557: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.12193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.13073: done with get_vars() 15896 1727203877.13088: done getting variables 15896 1727203877.13116: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.052) 0:00:22.720 ***** 15896 1727203877.13137: entering _queue_task() for managed-node1/set_fact 15896 1727203877.13378: worker is 1 (out of 1 available) 15896 1727203877.13389: exiting _queue_task() for managed-node1/set_fact 15896 1727203877.13401: done queuing things up, now waiting for results queue to drain 15896 1727203877.13402: waiting for pending results... 
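The `set_fact` task just queued initializes `lsr_net_profile_exists` and the two related flags to `false`; whether the profile actually exists is decided by the `stat` module reply captured further down in this log, a single JSON document printed on the module's stdout. The snippet below decodes that reply — the JSON text is copied verbatim from the log, but the decoding code is purely illustrative, not Ansible's own:

```python
import json

# Verbatim stdout of AnsiballZ_stat.py as captured later in this log.
reply = (
    '{"changed": false, "stat": {"exists": false}, "invocation": '
    '{"module_args": {"get_attributes": false, "get_checksum": false, '
    '"get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", '
    '"follow": false, "checksum_algorithm": "sha1"}}}'
)

result = json.loads(reply)

# The ifcfg file for the bond0.1 profile is absent on the managed node,
# so the flags initialized by this set_fact task would stay false.
assert result["stat"]["exists"] is False
assert result["changed"] is False
print(result["invocation"]["module_args"]["path"])
```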
15896 1727203877.13578: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15896 1727203877.13654: in run() - task 028d2410-947f-fb83-b6ad-000000000674 15896 1727203877.13669: variable 'ansible_search_path' from source: unknown 15896 1727203877.13673: variable 'ansible_search_path' from source: unknown 15896 1727203877.13701: calling self._execute() 15896 1727203877.13777: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.13782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.13791: variable 'omit' from source: magic vars 15896 1727203877.14053: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.14064: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.14075: variable 'omit' from source: magic vars 15896 1727203877.14107: variable 'omit' from source: magic vars 15896 1727203877.14131: variable 'omit' from source: magic vars 15896 1727203877.14162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203877.14195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203877.14210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203877.14223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.14232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.14256: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203877.14260: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.14262: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203877.14335: Set connection var ansible_shell_type to sh 15896 1727203877.14340: Set connection var ansible_connection to ssh 15896 1727203877.14346: Set connection var ansible_shell_executable to /bin/sh 15896 1727203877.14351: Set connection var ansible_pipelining to False 15896 1727203877.14356: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203877.14362: Set connection var ansible_timeout to 10 15896 1727203877.14382: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.14386: variable 'ansible_connection' from source: unknown 15896 1727203877.14390: variable 'ansible_module_compression' from source: unknown 15896 1727203877.14394: variable 'ansible_shell_type' from source: unknown 15896 1727203877.14397: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.14399: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.14401: variable 'ansible_pipelining' from source: unknown 15896 1727203877.14403: variable 'ansible_timeout' from source: unknown 15896 1727203877.14405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.14502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203877.14510: variable 'omit' from source: magic vars 15896 1727203877.14520: starting attempt loop 15896 1727203877.14523: running the handler 15896 1727203877.14529: handler run complete 15896 1727203877.14538: attempt loop complete, returning result 15896 1727203877.14541: _execute() done 15896 1727203877.14543: dumping result to json 15896 1727203877.14545: done dumping result, returning 15896 1727203877.14552: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-fb83-b6ad-000000000674] 15896 1727203877.14556: sending task result for task 028d2410-947f-fb83-b6ad-000000000674 15896 1727203877.14639: done sending task result for task 028d2410-947f-fb83-b6ad-000000000674 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15896 1727203877.14696: no more pending results, returning what we have 15896 1727203877.14698: results queue empty 15896 1727203877.14699: checking for any_errors_fatal 15896 1727203877.14701: done checking for any_errors_fatal 15896 1727203877.14701: checking for max_fail_percentage 15896 1727203877.14703: done checking for max_fail_percentage 15896 1727203877.14703: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.14704: done checking to see if all hosts have failed 15896 1727203877.14705: getting the remaining hosts for this loop 15896 1727203877.14706: done getting the remaining hosts for this loop 15896 1727203877.14709: getting the next task for host managed-node1 15896 1727203877.14716: done getting next task for host managed-node1 15896 1727203877.14718: ^ task is: TASK: Stat profile file 15896 1727203877.14722: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.14725: getting variables 15896 1727203877.14726: in VariableManager get_vars() 15896 1727203877.14780: Calling all_inventory to load vars for managed-node1 15896 1727203877.14783: Calling groups_inventory to load vars for managed-node1 15896 1727203877.14785: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.14791: WORKER PROCESS EXITING 15896 1727203877.14799: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.14802: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.14804: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.15580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.16523: done with get_vars() 15896 1727203877.16546: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.034) 0:00:22.755 ***** 15896 1727203877.16639: entering _queue_task() for managed-node1/stat 15896 1727203877.16977: worker is 1 (out of 1 available) 15896 1727203877.16991: exiting _queue_task() for managed-node1/stat 15896 1727203877.17003: done queuing things up, now waiting for results queue to drain 15896 1727203877.17004: waiting for pending results... 
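The `stat` task just queued runs over SSH through Ansible's usual low-level command sequence, which appears piecewise in the chunks below: discover the home directory, create a private timestamped tmp directory, upload `AnsiballZ_stat.py` over sftp, mark it executable, run it, and remove the tmp directory. This sketch replays those `/bin/sh -c` steps locally; `base` stands in for the remote home directory and the fixed tmp name replaces the real timestamped one (both assumptions, and the sftp upload is simulated by creating an empty file):

```python
import os
import subprocess
import tempfile

# Stand-ins for the remote home dir and the timestamped tmp dir name.
base = tempfile.mkdtemp()
tmpdir = os.path.join(base, ".ansible", "tmp", "ansible-tmp-example")

def sh(cmd):
    # Ansible wraps each remote step as `/bin/sh -c '... && sleep 0'`.
    subprocess.run(["/bin/sh", "-c", cmd], check=True)

sh("echo ~ && sleep 0")                                              # 1: find home dir
sh(f'( umask 77 && mkdir -p "{os.path.dirname(tmpdir)}" '
   f'&& mkdir "{tmpdir}" ) && sleep 0')                              # 2: private tmp dir
open(os.path.join(tmpdir, "AnsiballZ_stat.py"), "w").close()         # 3: sftp upload (simulated)
sh(f'chmod u+x "{tmpdir}" "{tmpdir}/AnsiballZ_stat.py" && sleep 0')  # 4: mark executable
# 5: the module would run here: /usr/bin/python3.12 .../AnsiballZ_stat.py
sh(f'rm -f -r "{tmpdir}" > /dev/null 2>&1 && sleep 0')               # 6: clean up
```

After the cleanup step the timestamped directory is gone, which is why each task in the log below creates a fresh one.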
15896 1727203877.17285: running TaskExecutor() for managed-node1/TASK: Stat profile file 15896 1727203877.17416: in run() - task 028d2410-947f-fb83-b6ad-000000000675 15896 1727203877.17444: variable 'ansible_search_path' from source: unknown 15896 1727203877.17455: variable 'ansible_search_path' from source: unknown 15896 1727203877.17500: calling self._execute() 15896 1727203877.17584: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.17588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.17592: variable 'omit' from source: magic vars 15896 1727203877.17866: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.17877: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.17883: variable 'omit' from source: magic vars 15896 1727203877.17921: variable 'omit' from source: magic vars 15896 1727203877.17990: variable 'profile' from source: include params 15896 1727203877.17994: variable 'item' from source: include params 15896 1727203877.18041: variable 'item' from source: include params 15896 1727203877.18056: variable 'omit' from source: magic vars 15896 1727203877.18093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203877.18121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203877.18136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203877.18149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.18159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.18184: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203877.18187: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.18190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.18259: Set connection var ansible_shell_type to sh 15896 1727203877.18266: Set connection var ansible_connection to ssh 15896 1727203877.18271: Set connection var ansible_shell_executable to /bin/sh 15896 1727203877.18278: Set connection var ansible_pipelining to False 15896 1727203877.18283: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203877.18288: Set connection var ansible_timeout to 10 15896 1727203877.18307: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.18309: variable 'ansible_connection' from source: unknown 15896 1727203877.18312: variable 'ansible_module_compression' from source: unknown 15896 1727203877.18314: variable 'ansible_shell_type' from source: unknown 15896 1727203877.18316: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.18319: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.18323: variable 'ansible_pipelining' from source: unknown 15896 1727203877.18326: variable 'ansible_timeout' from source: unknown 15896 1727203877.18329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.18487: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203877.18494: variable 'omit' from source: magic vars 15896 1727203877.18500: starting attempt loop 15896 1727203877.18503: running the handler 15896 1727203877.18514: _low_level_execute_command(): starting 15896 1727203877.18521: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203877.19045: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.19049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.19053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203877.19056: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.19107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.19111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.19113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.19202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.20998: stdout chunk (state=3): >>>/root <<< 15896 1727203877.21113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.21127: stderr chunk (state=3): >>><<< 15896 1727203877.21130: stdout chunk (state=3): >>><<< 15896 1727203877.21150: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.21162: _low_level_execute_command(): starting 15896 1727203877.21171: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725 `" && echo ansible-tmp-1727203877.2115016-17923-215611773246725="` echo /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725 `" ) && sleep 0' 15896 1727203877.21605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.21609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203877.21618: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203877.21620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.21623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.21657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.21679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.21754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.23992: stdout chunk (state=3): >>>ansible-tmp-1727203877.2115016-17923-215611773246725=/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725 <<< 15896 1727203877.24073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.24084: stderr chunk (state=3): >>><<< 15896 1727203877.24088: stdout chunk (state=3): >>><<< 15896 1727203877.24091: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203877.2115016-17923-215611773246725=/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.24139: variable 'ansible_module_compression' from source: unknown 15896 1727203877.24235: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15896 1727203877.24287: variable 'ansible_facts' from source: unknown 15896 1727203877.24410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py 15896 1727203877.24655: Sending initial data 15896 1727203877.24658: Sent initial data (153 bytes) 15896 1727203877.25355: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.25373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.25391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.25424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.25529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.25699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.25810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.27723: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203877.27925: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203877.28005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpjlxe_qsv /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py <<< 15896 1727203877.28017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py" <<< 15896 1727203877.28122: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpjlxe_qsv" to remote "/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py" <<< 15896 1727203877.29494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.29545: stderr chunk (state=3): >>><<< 15896 1727203877.29548: stdout chunk (state=3): >>><<< 15896 1727203877.29739: done transferring module to remote 15896 1727203877.29742: _low_level_execute_command(): starting 15896 1727203877.29745: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/ /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py && sleep 0' 15896 1727203877.30517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.30535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.30550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.30569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.30648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.30700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.30720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.30761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.30842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.32866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.32902: stdout chunk (state=3): >>><<< 15896 1727203877.32905: stderr chunk (state=3): >>><<< 15896 1727203877.33000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.33004: _low_level_execute_command(): starting 15896 1727203877.33006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/AnsiballZ_stat.py && sleep 0' 15896 1727203877.33611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.33649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.33753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.33777: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15896 1727203877.33904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.50425: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15896 1727203877.52123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203877.52128: stdout chunk (state=3): >>><<< 15896 1727203877.52283: stderr chunk (state=3): >>><<< 15896 1727203877.52287: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203877.52290: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203877.52292: _low_level_execute_command(): starting 15896 1727203877.52294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203877.2115016-17923-215611773246725/ > /dev/null 2>&1 && sleep 0' 15896 1727203877.53316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.53340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.53380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.53399: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 15896 1727203877.53446: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.53509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.53532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.53566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.53657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.55674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.55682: stdout chunk (state=3): >>><<< 15896 1727203877.55685: stderr chunk (state=3): >>><<< 15896 1727203877.55880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.55884: handler run complete 15896 1727203877.55887: attempt loop complete, returning result 15896 1727203877.55889: _execute() done 15896 1727203877.55891: dumping result to json 15896 1727203877.55893: done dumping result, returning 15896 1727203877.55894: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-fb83-b6ad-000000000675] 15896 1727203877.55896: sending task result for task 028d2410-947f-fb83-b6ad-000000000675 15896 1727203877.55963: done sending task result for task 028d2410-947f-fb83-b6ad-000000000675 15896 1727203877.55967: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 15896 1727203877.56043: no more pending results, returning what we have 15896 1727203877.56047: results queue empty 15896 1727203877.56048: checking for any_errors_fatal 15896 1727203877.56056: done checking for any_errors_fatal 15896 1727203877.56057: checking for max_fail_percentage 15896 1727203877.56059: done checking for max_fail_percentage 15896 1727203877.56059: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.56060: done checking to see if all hosts have failed 15896 1727203877.56063: getting the remaining hosts for this loop 15896 1727203877.56065: done getting the remaining hosts for this loop 15896 1727203877.56068: getting the next task for host managed-node1 15896 1727203877.56102: done getting next task for host managed-node1 15896 1727203877.56105: ^ task is: TASK: Set NM profile exist flag based on the profile 
files 15896 1727203877.56109: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.56112: getting variables 15896 1727203877.56114: in VariableManager get_vars() 15896 1727203877.56166: Calling all_inventory to load vars for managed-node1 15896 1727203877.56168: Calling groups_inventory to load vars for managed-node1 15896 1727203877.56170: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.56291: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.56294: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.56298: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.57865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.58714: done with get_vars() 15896 1727203877.58731: done getting variables 15896 1727203877.58775: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.421) 0:00:23.177 ***** 15896 1727203877.58798: entering _queue_task() for managed-node1/set_fact 15896 1727203877.59046: worker is 1 (out of 1 available) 15896 1727203877.59060: exiting _queue_task() for managed-node1/set_fact 15896 1727203877.59071: done queuing things up, now waiting for results queue to drain 15896 1727203877.59073: waiting for pending results... 15896 1727203877.59301: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 15896 1727203877.59434: in run() - task 028d2410-947f-fb83-b6ad-000000000676 15896 1727203877.59442: variable 'ansible_search_path' from source: unknown 15896 1727203877.59446: variable 'ansible_search_path' from source: unknown 15896 1727203877.59484: calling self._execute() 15896 1727203877.59686: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.59885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.59889: variable 'omit' from source: magic vars 15896 1727203877.60308: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.60328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.60481: variable 'profile_stat' from source: set_fact 15896 1727203877.60484: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203877.60487: when evaluation is False, skipping this task 15896 1727203877.60490: _execute() done 15896 1727203877.60492: dumping result to json 15896 1727203877.60494: done dumping result, returning 15896 1727203877.60496: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile 
files [028d2410-947f-fb83-b6ad-000000000676] 15896 1727203877.60498: sending task result for task 028d2410-947f-fb83-b6ad-000000000676 15896 1727203877.60615: done sending task result for task 028d2410-947f-fb83-b6ad-000000000676 15896 1727203877.60618: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203877.60701: no more pending results, returning what we have 15896 1727203877.60704: results queue empty 15896 1727203877.60705: checking for any_errors_fatal 15896 1727203877.60711: done checking for any_errors_fatal 15896 1727203877.60712: checking for max_fail_percentage 15896 1727203877.60714: done checking for max_fail_percentage 15896 1727203877.60714: checking to see if all hosts have failed and the running result is not ok 15896 1727203877.60715: done checking to see if all hosts have failed 15896 1727203877.60715: getting the remaining hosts for this loop 15896 1727203877.60717: done getting the remaining hosts for this loop 15896 1727203877.60719: getting the next task for host managed-node1 15896 1727203877.60725: done getting next task for host managed-node1 15896 1727203877.60727: ^ task is: TASK: Get NM profile info 15896 1727203877.60730: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203877.60734: getting variables 15896 1727203877.60735: in VariableManager get_vars() 15896 1727203877.60777: Calling all_inventory to load vars for managed-node1 15896 1727203877.60780: Calling groups_inventory to load vars for managed-node1 15896 1727203877.60782: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203877.60791: Calling all_plugins_play to load vars for managed-node1 15896 1727203877.60793: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203877.60795: Calling groups_plugins_play to load vars for managed-node1 15896 1727203877.62131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203877.64348: done with get_vars() 15896 1727203877.64381: done getting variables 15896 1727203877.64445: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:51:17 -0400 (0:00:00.056) 0:00:23.234 ***** 15896 1727203877.64479: entering _queue_task() for managed-node1/shell 15896 1727203877.65001: worker is 1 (out of 1 available) 15896 1727203877.65014: exiting _queue_task() for managed-node1/shell 15896 1727203877.65085: done queuing things up, now waiting for results queue to drain 15896 1727203877.65087: waiting for pending results... 
15896 1727203877.65368: running TaskExecutor() for managed-node1/TASK: Get NM profile info 15896 1727203877.65442: in run() - task 028d2410-947f-fb83-b6ad-000000000677 15896 1727203877.65471: variable 'ansible_search_path' from source: unknown 15896 1727203877.65481: variable 'ansible_search_path' from source: unknown 15896 1727203877.65574: calling self._execute() 15896 1727203877.65644: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.65659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.65680: variable 'omit' from source: magic vars 15896 1727203877.66092: variable 'ansible_distribution_major_version' from source: facts 15896 1727203877.66236: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203877.66240: variable 'omit' from source: magic vars 15896 1727203877.66242: variable 'omit' from source: magic vars 15896 1727203877.66501: variable 'profile' from source: include params 15896 1727203877.66505: variable 'item' from source: include params 15896 1727203877.66507: variable 'item' from source: include params 15896 1727203877.66538: variable 'omit' from source: magic vars 15896 1727203877.66586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203877.66635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203877.66660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203877.66690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.66713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203877.66754: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203877.66762: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.66769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.66928: Set connection var ansible_shell_type to sh 15896 1727203877.66933: Set connection var ansible_connection to ssh 15896 1727203877.66935: Set connection var ansible_shell_executable to /bin/sh 15896 1727203877.66937: Set connection var ansible_pipelining to False 15896 1727203877.66943: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203877.66946: Set connection var ansible_timeout to 10 15896 1727203877.67056: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.67059: variable 'ansible_connection' from source: unknown 15896 1727203877.67061: variable 'ansible_module_compression' from source: unknown 15896 1727203877.67063: variable 'ansible_shell_type' from source: unknown 15896 1727203877.67065: variable 'ansible_shell_executable' from source: unknown 15896 1727203877.67066: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203877.67068: variable 'ansible_pipelining' from source: unknown 15896 1727203877.67070: variable 'ansible_timeout' from source: unknown 15896 1727203877.67072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203877.67153: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203877.67299: variable 'omit' from source: magic vars 15896 1727203877.67480: starting attempt loop 15896 1727203877.67483: running the handler 15896 1727203877.67486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203877.67488: _low_level_execute_command(): starting 15896 1727203877.67490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203877.68178: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.68197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.68214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.68233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.68279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.68296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.68386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.68408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.68521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.70320: stdout chunk 
(state=3): >>>/root <<< 15896 1727203877.70421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.70486: stderr chunk (state=3): >>><<< 15896 1727203877.70495: stdout chunk (state=3): >>><<< 15896 1727203877.70523: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.70544: _low_level_execute_command(): starting 15896 1727203877.70556: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466 `" && echo ansible-tmp-1727203877.7053106-17957-152282088779466="` echo /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466 `" ) && sleep 0' 15896 1727203877.71271: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.71295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.71311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.71327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.71366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203877.71383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.71395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203877.71479: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.71501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.71514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.71730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.73820: stdout chunk (state=3): >>>ansible-tmp-1727203877.7053106-17957-152282088779466=/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466 <<< 15896 1727203877.73989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.73992: stdout chunk (state=3): >>><<< 
15896 1727203877.73994: stderr chunk (state=3): >>><<< 15896 1727203877.74023: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203877.7053106-17957-152282088779466=/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.74181: variable 'ansible_module_compression' from source: unknown 15896 1727203877.74184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203877.74186: variable 'ansible_facts' from source: unknown 15896 1727203877.74256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py 15896 1727203877.74431: Sending initial data 15896 1727203877.74441: Sent initial data (156 bytes) 15896 
1727203877.75100: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.75194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.75226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.75244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.75265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.75387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.77167: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203877.77247: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203877.77318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpfxr8gv34 /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py <<< 15896 1727203877.77321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py" <<< 15896 1727203877.77421: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpfxr8gv34" to remote "/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py" <<< 15896 1727203877.78397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.78401: stderr chunk (state=3): >>><<< 15896 1727203877.78403: stdout chunk (state=3): >>><<< 15896 1727203877.78411: done transferring module to remote 15896 1727203877.78425: _low_level_execute_command(): starting 15896 1727203877.78441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/ /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py && sleep 0' 15896 1727203877.79095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203877.79190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.79200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.79230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203877.79245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.79265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.79367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203877.81349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203877.81356: stdout chunk (state=3): >>><<< 15896 1727203877.81364: stderr chunk (state=3): >>><<< 15896 1727203877.81384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203877.81391: _low_level_execute_command(): starting 15896 1727203877.81397: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/AnsiballZ_command.py && sleep 0' 15896 1727203877.81815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203877.81818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203877.81820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203877.81823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203877.81825: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203877.81873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203877.81882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203877.81965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.00543: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:51:17.982052", "end": "2024-09-24 14:51:18.003755", "delta": "0:00:00.021703", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203878.02589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203878.02593: stdout chunk (state=3): >>><<< 15896 1727203878.02595: stderr chunk (state=3): >>><<< 15896 1727203878.02598: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:51:17.982052", "end": "2024-09-24 14:51:18.003755", "delta": "0:00:00.021703", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 15896 1727203878.02601: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203878.02608: _low_level_execute_command(): starting 15896 1727203878.02610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203877.7053106-17957-152282088779466/ > /dev/null 2>&1 && sleep 0' 15896 1727203878.03119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.03142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.03185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.03198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.03285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.05367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203878.05371: stdout chunk (state=3): >>><<< 15896 1727203878.05374: stderr chunk (state=3): >>><<< 15896 1727203878.05399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15896 1727203878.05417: handler run complete 15896 1727203878.05447: Evaluated conditional (False): False 15896 1727203878.05455: attempt loop complete, returning result 15896 1727203878.05458: _execute() done 15896 1727203878.05460: dumping result to json 15896 1727203878.05468: done dumping result, returning 15896 1727203878.05477: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-fb83-b6ad-000000000677] 15896 1727203878.05480: sending task result for task 028d2410-947f-fb83-b6ad-000000000677 15896 1727203878.05584: done sending task result for task 028d2410-947f-fb83-b6ad-000000000677 15896 1727203878.05587: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021703", "end": "2024-09-24 14:51:18.003755", "rc": 0, "start": "2024-09-24 14:51:17.982052" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 15896 1727203878.05655: no more pending results, returning what we have 15896 1727203878.05658: results queue empty 15896 1727203878.05659: checking for any_errors_fatal 15896 1727203878.05666: done checking for any_errors_fatal 15896 1727203878.05666: checking for max_fail_percentage 15896 1727203878.05668: done checking for max_fail_percentage 15896 1727203878.05669: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.05669: done checking to see if all hosts have failed 15896 1727203878.05670: getting the remaining hosts for this loop 15896 1727203878.05672: done getting the remaining hosts for this loop 15896 1727203878.05677: getting the next task for host managed-node1 15896 1727203878.05684: done getting next task for host managed-node1 15896 1727203878.05687: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203878.05691: ^ state is: HOST STATE: block=2, task=13, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203878.05697: getting variables 15896 1727203878.05698: in VariableManager get_vars() 15896 1727203878.05746: Calling all_inventory to load vars for managed-node1 15896 1727203878.05749: Calling groups_inventory to load vars for managed-node1 15896 1727203878.05751: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.05761: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.05763: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.05766: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.06667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.07703: done with get_vars() 15896 1727203878.07725: done getting variables 15896 1727203878.07786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and 
ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.433) 0:00:23.667 ***** 15896 1727203878.07817: entering _queue_task() for managed-node1/set_fact 15896 1727203878.08137: worker is 1 (out of 1 available) 15896 1727203878.08150: exiting _queue_task() for managed-node1/set_fact 15896 1727203878.08162: done queuing things up, now waiting for results queue to drain 15896 1727203878.08164: waiting for pending results... 15896 1727203878.08592: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15896 1727203878.08598: in run() - task 028d2410-947f-fb83-b6ad-000000000678 15896 1727203878.08601: variable 'ansible_search_path' from source: unknown 15896 1727203878.08603: variable 'ansible_search_path' from source: unknown 15896 1727203878.08635: calling self._execute() 15896 1727203878.08763: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.08767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.08774: variable 'omit' from source: magic vars 15896 1727203878.09056: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.09066: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.09156: variable 'nm_profile_exists' from source: set_fact 15896 1727203878.09169: Evaluated conditional (nm_profile_exists.rc == 0): True 15896 1727203878.09174: variable 'omit' from source: magic vars 15896 1727203878.09211: variable 'omit' from source: magic vars 15896 1727203878.09233: variable 'omit' from source: magic vars 15896 1727203878.09267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.09295: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.09310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203878.09322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.09332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.09367: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203878.09370: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.09372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.09434: Set connection var ansible_shell_type to sh 15896 1727203878.09440: Set connection var ansible_connection to ssh 15896 1727203878.09445: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.09451: Set connection var ansible_pipelining to False 15896 1727203878.09457: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.09464: Set connection var ansible_timeout to 10 15896 1727203878.09483: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.09486: variable 'ansible_connection' from source: unknown 15896 1727203878.09488: variable 'ansible_module_compression' from source: unknown 15896 1727203878.09491: variable 'ansible_shell_type' from source: unknown 15896 1727203878.09493: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.09495: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.09499: variable 'ansible_pipelining' from source: unknown 15896 1727203878.09501: variable 'ansible_timeout' from source: unknown 15896 1727203878.09506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 
1727203878.09606: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.09615: variable 'omit' from source: magic vars 15896 1727203878.09620: starting attempt loop 15896 1727203878.09623: running the handler 15896 1727203878.09633: handler run complete 15896 1727203878.09641: attempt loop complete, returning result 15896 1727203878.09644: _execute() done 15896 1727203878.09646: dumping result to json 15896 1727203878.09648: done dumping result, returning 15896 1727203878.09656: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-fb83-b6ad-000000000678] 15896 1727203878.09660: sending task result for task 028d2410-947f-fb83-b6ad-000000000678 15896 1727203878.09756: done sending task result for task 028d2410-947f-fb83-b6ad-000000000678 15896 1727203878.09759: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15896 1727203878.09828: no more pending results, returning what we have 15896 1727203878.09831: results queue empty 15896 1727203878.09832: checking for any_errors_fatal 15896 1727203878.09840: done checking for any_errors_fatal 15896 1727203878.09840: checking for max_fail_percentage 15896 1727203878.09842: done checking for max_fail_percentage 15896 1727203878.09843: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.09843: done checking to see if all hosts have failed 15896 1727203878.09844: getting the remaining hosts for this loop 15896 1727203878.09846: done getting the remaining hosts for this loop 15896 
1727203878.09849: getting the next task for host managed-node1 15896 1727203878.09858: done getting next task for host managed-node1 15896 1727203878.09860: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203878.09865: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203878.09869: getting variables 15896 1727203878.09870: in VariableManager get_vars() 15896 1727203878.09914: Calling all_inventory to load vars for managed-node1 15896 1727203878.09917: Calling groups_inventory to load vars for managed-node1 15896 1727203878.09919: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.09927: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.09929: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.09932: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.11035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.12457: done with get_vars() 15896 1727203878.12477: done getting variables 15896 1727203878.12521: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.12608: variable 'profile' from source: include params 15896 1727203878.12612: variable 'item' from source: include params 15896 1727203878.12652: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.048) 0:00:23.716 ***** 15896 1727203878.12682: entering _queue_task() for managed-node1/command 15896 1727203878.12916: worker is 1 (out of 1 available) 15896 1727203878.12929: exiting _queue_task() for managed-node1/command 15896 1727203878.12939: done queuing things up, now waiting for results queue to drain 15896 1727203878.12941: waiting for pending results... 
15896 1727203878.13118: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 15896 1727203878.13195: in run() - task 028d2410-947f-fb83-b6ad-00000000067a 15896 1727203878.13207: variable 'ansible_search_path' from source: unknown 15896 1727203878.13211: variable 'ansible_search_path' from source: unknown 15896 1727203878.13238: calling self._execute() 15896 1727203878.13315: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.13318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.13327: variable 'omit' from source: magic vars 15896 1727203878.13586: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.13595: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.13679: variable 'profile_stat' from source: set_fact 15896 1727203878.13690: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203878.13693: when evaluation is False, skipping this task 15896 1727203878.13696: _execute() done 15896 1727203878.13698: dumping result to json 15896 1727203878.13700: done dumping result, returning 15896 1727203878.13709: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [028d2410-947f-fb83-b6ad-00000000067a] 15896 1727203878.13719: sending task result for task 028d2410-947f-fb83-b6ad-00000000067a 15896 1727203878.13795: done sending task result for task 028d2410-947f-fb83-b6ad-00000000067a 15896 1727203878.13798: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203878.13896: no more pending results, returning what we have 15896 1727203878.13899: results queue empty 15896 1727203878.13900: checking for any_errors_fatal 15896 1727203878.13905: done checking for any_errors_fatal 15896 1727203878.13906: 
checking for max_fail_percentage 15896 1727203878.13907: done checking for max_fail_percentage 15896 1727203878.13908: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.13909: done checking to see if all hosts have failed 15896 1727203878.13909: getting the remaining hosts for this loop 15896 1727203878.13911: done getting the remaining hosts for this loop 15896 1727203878.13914: getting the next task for host managed-node1 15896 1727203878.13919: done getting next task for host managed-node1 15896 1727203878.13921: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15896 1727203878.13925: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203878.13928: getting variables 15896 1727203878.13930: in VariableManager get_vars() 15896 1727203878.13973: Calling all_inventory to load vars for managed-node1 15896 1727203878.13977: Calling groups_inventory to load vars for managed-node1 15896 1727203878.13979: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.13988: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.13990: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.13993: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.18385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.19296: done with get_vars() 15896 1727203878.19317: done getting variables 15896 1727203878.19366: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.19459: variable 'profile' from source: include params 15896 1727203878.19462: variable 'item' from source: include params 15896 1727203878.19523: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.068) 0:00:23.784 ***** 15896 1727203878.19550: entering _queue_task() for managed-node1/set_fact 15896 1727203878.19895: worker is 1 (out of 1 available) 15896 1727203878.19909: exiting _queue_task() for managed-node1/set_fact 15896 1727203878.19921: done queuing things up, now waiting for results queue to drain 15896 1727203878.19923: waiting for pending results... 
15896 1727203878.20255: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 15896 1727203878.20331: in run() - task 028d2410-947f-fb83-b6ad-00000000067b 15896 1727203878.20350: variable 'ansible_search_path' from source: unknown 15896 1727203878.20355: variable 'ansible_search_path' from source: unknown 15896 1727203878.20377: calling self._execute() 15896 1727203878.20464: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.20468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.20478: variable 'omit' from source: magic vars 15896 1727203878.20747: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.20755: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.20841: variable 'profile_stat' from source: set_fact 15896 1727203878.20852: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203878.20855: when evaluation is False, skipping this task 15896 1727203878.20859: _execute() done 15896 1727203878.20864: dumping result to json 15896 1727203878.20868: done dumping result, returning 15896 1727203878.20870: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [028d2410-947f-fb83-b6ad-00000000067b] 15896 1727203878.20876: sending task result for task 028d2410-947f-fb83-b6ad-00000000067b 15896 1727203878.20964: done sending task result for task 028d2410-947f-fb83-b6ad-00000000067b 15896 1727203878.20967: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203878.21013: no more pending results, returning what we have 15896 1727203878.21017: results queue empty 15896 1727203878.21017: checking for any_errors_fatal 15896 1727203878.21024: done checking for any_errors_fatal 15896 1727203878.21024: 
checking for max_fail_percentage 15896 1727203878.21026: done checking for max_fail_percentage 15896 1727203878.21027: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.21027: done checking to see if all hosts have failed 15896 1727203878.21028: getting the remaining hosts for this loop 15896 1727203878.21030: done getting the remaining hosts for this loop 15896 1727203878.21033: getting the next task for host managed-node1 15896 1727203878.21041: done getting next task for host managed-node1 15896 1727203878.21043: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15896 1727203878.21046: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203878.21051: getting variables 15896 1727203878.21052: in VariableManager get_vars() 15896 1727203878.21106: Calling all_inventory to load vars for managed-node1 15896 1727203878.21109: Calling groups_inventory to load vars for managed-node1 15896 1727203878.21111: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.21121: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.21123: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.21126: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.21893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.23512: done with get_vars() 15896 1727203878.23533: done getting variables 15896 1727203878.23594: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.23702: variable 'profile' from source: include params 15896 1727203878.23706: variable 'item' from source: include params 15896 1727203878.23765: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.042) 0:00:23.827 ***** 15896 1727203878.23795: entering _queue_task() for managed-node1/command 15896 1727203878.24093: worker is 1 (out of 1 available) 15896 1727203878.24107: exiting _queue_task() for managed-node1/command 15896 1727203878.24119: done queuing things up, now waiting for results queue to drain 15896 1727203878.24121: waiting for pending results... 
15896 1727203878.24502: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 15896 1727203878.24531: in run() - task 028d2410-947f-fb83-b6ad-00000000067c 15896 1727203878.24550: variable 'ansible_search_path' from source: unknown 15896 1727203878.24557: variable 'ansible_search_path' from source: unknown 15896 1727203878.24605: calling self._execute() 15896 1727203878.24711: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.24723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.24741: variable 'omit' from source: magic vars 15896 1727203878.25124: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.25145: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.25357: variable 'profile_stat' from source: set_fact 15896 1727203878.25360: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203878.25365: when evaluation is False, skipping this task 15896 1727203878.25367: _execute() done 15896 1727203878.25369: dumping result to json 15896 1727203878.25371: done dumping result, returning 15896 1727203878.25374: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [028d2410-947f-fb83-b6ad-00000000067c] 15896 1727203878.25378: sending task result for task 028d2410-947f-fb83-b6ad-00000000067c 15896 1727203878.25441: done sending task result for task 028d2410-947f-fb83-b6ad-00000000067c 15896 1727203878.25444: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203878.25500: no more pending results, returning what we have 15896 1727203878.25504: results queue empty 15896 1727203878.25504: checking for any_errors_fatal 15896 1727203878.25510: done checking for any_errors_fatal 15896 1727203878.25511: checking for 
max_fail_percentage 15896 1727203878.25513: done checking for max_fail_percentage 15896 1727203878.25514: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.25514: done checking to see if all hosts have failed 15896 1727203878.25515: getting the remaining hosts for this loop 15896 1727203878.25517: done getting the remaining hosts for this loop 15896 1727203878.25520: getting the next task for host managed-node1 15896 1727203878.25527: done getting next task for host managed-node1 15896 1727203878.25530: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15896 1727203878.25534: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203878.25538: getting variables 15896 1727203878.25539: in VariableManager get_vars() 15896 1727203878.25599: Calling all_inventory to load vars for managed-node1 15896 1727203878.25602: Calling groups_inventory to load vars for managed-node1 15896 1727203878.25604: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.25618: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.25621: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.25624: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.27153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.28704: done with get_vars() 15896 1727203878.28726: done getting variables 15896 1727203878.28787: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.28897: variable 'profile' from source: include params 15896 1727203878.28901: variable 'item' from source: include params 15896 1727203878.28957: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.051) 0:00:23.879 ***** 15896 1727203878.28993: entering _queue_task() for managed-node1/set_fact 15896 1727203878.29294: worker is 1 (out of 1 available) 15896 1727203878.29306: exiting _queue_task() for managed-node1/set_fact 15896 1727203878.29317: done queuing things up, now waiting for results queue to drain 15896 1727203878.29319: waiting for pending results... 
15896 1727203878.29698: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 15896 1727203878.29735: in run() - task 028d2410-947f-fb83-b6ad-00000000067d 15896 1727203878.29754: variable 'ansible_search_path' from source: unknown 15896 1727203878.29764: variable 'ansible_search_path' from source: unknown 15896 1727203878.29807: calling self._execute() 15896 1727203878.29910: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.29921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.29934: variable 'omit' from source: magic vars 15896 1727203878.30304: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.30320: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.30444: variable 'profile_stat' from source: set_fact 15896 1727203878.30467: Evaluated conditional (profile_stat.stat.exists): False 15896 1727203878.30478: when evaluation is False, skipping this task 15896 1727203878.30485: _execute() done 15896 1727203878.30552: dumping result to json 15896 1727203878.30555: done dumping result, returning 15896 1727203878.30558: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [028d2410-947f-fb83-b6ad-00000000067d] 15896 1727203878.30563: sending task result for task 028d2410-947f-fb83-b6ad-00000000067d 15896 1727203878.30633: done sending task result for task 028d2410-947f-fb83-b6ad-00000000067d 15896 1727203878.30636: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15896 1727203878.30708: no more pending results, returning what we have 15896 1727203878.30711: results queue empty 15896 1727203878.30712: checking for any_errors_fatal 15896 1727203878.30719: done checking for any_errors_fatal 15896 1727203878.30720: checking 
for max_fail_percentage 15896 1727203878.30722: done checking for max_fail_percentage 15896 1727203878.30723: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.30723: done checking to see if all hosts have failed 15896 1727203878.30724: getting the remaining hosts for this loop 15896 1727203878.30726: done getting the remaining hosts for this loop 15896 1727203878.30729: getting the next task for host managed-node1 15896 1727203878.30737: done getting next task for host managed-node1 15896 1727203878.30740: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15896 1727203878.30744: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203878.30748: getting variables 15896 1727203878.30750: in VariableManager get_vars() 15896 1727203878.30817: Calling all_inventory to load vars for managed-node1 15896 1727203878.30820: Calling groups_inventory to load vars for managed-node1 15896 1727203878.30823: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.30835: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.30838: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.30841: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.32569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.36472: done with get_vars() 15896 1727203878.36507: done getting variables 15896 1727203878.36574: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.36907: variable 'profile' from source: include params 15896 1727203878.36912: variable 'item' from source: include params 15896 1727203878.37180: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.082) 0:00:23.961 ***** 15896 1727203878.37210: entering _queue_task() for managed-node1/assert 15896 1727203878.37789: worker is 1 (out of 1 available) 15896 1727203878.37801: exiting _queue_task() for managed-node1/assert 15896 1727203878.37814: done queuing things up, now waiting for results queue to drain 15896 1727203878.37815: waiting for pending results... 
15896 1727203878.38430: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' 15896 1727203878.38683: in run() - task 028d2410-947f-fb83-b6ad-000000000364 15896 1727203878.38688: variable 'ansible_search_path' from source: unknown 15896 1727203878.38691: variable 'ansible_search_path' from source: unknown 15896 1727203878.38693: calling self._execute() 15896 1727203878.38696: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.38699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.38702: variable 'omit' from source: magic vars 15896 1727203878.39047: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.39065: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.39079: variable 'omit' from source: magic vars 15896 1727203878.39125: variable 'omit' from source: magic vars 15896 1727203878.39223: variable 'profile' from source: include params 15896 1727203878.39233: variable 'item' from source: include params 15896 1727203878.39298: variable 'item' from source: include params 15896 1727203878.39322: variable 'omit' from source: magic vars 15896 1727203878.39371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.39412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.39437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203878.39461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.39479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.39514: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 15896 1727203878.39522: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.39530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.39628: Set connection var ansible_shell_type to sh 15896 1727203878.39641: Set connection var ansible_connection to ssh 15896 1727203878.39650: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.39659: Set connection var ansible_pipelining to False 15896 1727203878.39667: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.39678: Set connection var ansible_timeout to 10 15896 1727203878.39702: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.39709: variable 'ansible_connection' from source: unknown 15896 1727203878.39716: variable 'ansible_module_compression' from source: unknown 15896 1727203878.39724: variable 'ansible_shell_type' from source: unknown 15896 1727203878.39727: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.39729: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.39733: variable 'ansible_pipelining' from source: unknown 15896 1727203878.39741: variable 'ansible_timeout' from source: unknown 15896 1727203878.39748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.39881: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.39896: variable 'omit' from source: magic vars 15896 1727203878.39905: starting attempt loop 15896 1727203878.39911: running the handler 15896 1727203878.40018: variable 'lsr_net_profile_exists' from source: set_fact 15896 1727203878.40028: Evaluated conditional 
(lsr_net_profile_exists): True 15896 1727203878.40038: handler run complete 15896 1727203878.40055: attempt loop complete, returning result 15896 1727203878.40062: _execute() done 15896 1727203878.40069: dumping result to json 15896 1727203878.40078: done dumping result, returning 15896 1727203878.40088: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' [028d2410-947f-fb83-b6ad-000000000364] 15896 1727203878.40284: sending task result for task 028d2410-947f-fb83-b6ad-000000000364 15896 1727203878.40353: done sending task result for task 028d2410-947f-fb83-b6ad-000000000364 15896 1727203878.40355: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203878.40421: no more pending results, returning what we have 15896 1727203878.40424: results queue empty 15896 1727203878.40425: checking for any_errors_fatal 15896 1727203878.40430: done checking for any_errors_fatal 15896 1727203878.40430: checking for max_fail_percentage 15896 1727203878.40432: done checking for max_fail_percentage 15896 1727203878.40433: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.40434: done checking to see if all hosts have failed 15896 1727203878.40434: getting the remaining hosts for this loop 15896 1727203878.40435: done getting the remaining hosts for this loop 15896 1727203878.40438: getting the next task for host managed-node1 15896 1727203878.40443: done getting next task for host managed-node1 15896 1727203878.40446: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15896 1727203878.40448: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203878.40451: getting variables 15896 1727203878.40453: in VariableManager get_vars() 15896 1727203878.40503: Calling all_inventory to load vars for managed-node1 15896 1727203878.40505: Calling groups_inventory to load vars for managed-node1 15896 1727203878.40508: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.40517: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.40519: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.40522: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.41945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.43545: done with get_vars() 15896 1727203878.43571: done getting variables 15896 1727203878.43638: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.43759: variable 'profile' from source: include params 15896 1727203878.43763: variable 'item' from source: include params 15896 1727203878.43822: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:51:18 -0400 
(0:00:00.066) 0:00:24.027 ***** 15896 1727203878.43864: entering _queue_task() for managed-node1/assert 15896 1727203878.44388: worker is 1 (out of 1 available) 15896 1727203878.44400: exiting _queue_task() for managed-node1/assert 15896 1727203878.44411: done queuing things up, now waiting for results queue to drain 15896 1727203878.44413: waiting for pending results... 15896 1727203878.44713: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 15896 1727203878.44817: in run() - task 028d2410-947f-fb83-b6ad-000000000365 15896 1727203878.44830: variable 'ansible_search_path' from source: unknown 15896 1727203878.44834: variable 'ansible_search_path' from source: unknown 15896 1727203878.44869: calling self._execute() 15896 1727203878.44977: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.44981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.45001: variable 'omit' from source: magic vars 15896 1727203878.45385: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.45396: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.45402: variable 'omit' from source: magic vars 15896 1727203878.45445: variable 'omit' from source: magic vars 15896 1727203878.45550: variable 'profile' from source: include params 15896 1727203878.45553: variable 'item' from source: include params 15896 1727203878.45619: variable 'item' from source: include params 15896 1727203878.45780: variable 'omit' from source: magic vars 15896 1727203878.45783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.45786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.45788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 
1727203878.45790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.45792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.45802: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203878.45805: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.45810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.45917: Set connection var ansible_shell_type to sh 15896 1727203878.45924: Set connection var ansible_connection to ssh 15896 1727203878.45930: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.45935: Set connection var ansible_pipelining to False 15896 1727203878.45940: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.45946: Set connection var ansible_timeout to 10 15896 1727203878.45979: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.45982: variable 'ansible_connection' from source: unknown 15896 1727203878.45985: variable 'ansible_module_compression' from source: unknown 15896 1727203878.45987: variable 'ansible_shell_type' from source: unknown 15896 1727203878.45990: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.45992: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.45994: variable 'ansible_pipelining' from source: unknown 15896 1727203878.45997: variable 'ansible_timeout' from source: unknown 15896 1727203878.46002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.46139: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.46149: variable 'omit' from source: magic vars 15896 1727203878.46155: starting attempt loop 15896 1727203878.46158: running the handler 15896 1727203878.46271: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15896 1727203878.46274: Evaluated conditional (lsr_net_profile_ansible_managed): True 15896 1727203878.46282: handler run complete 15896 1727203878.46305: attempt loop complete, returning result 15896 1727203878.46308: _execute() done 15896 1727203878.46310: dumping result to json 15896 1727203878.46313: done dumping result, returning 15896 1727203878.46321: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [028d2410-947f-fb83-b6ad-000000000365] 15896 1727203878.46480: sending task result for task 028d2410-947f-fb83-b6ad-000000000365 15896 1727203878.46546: done sending task result for task 028d2410-947f-fb83-b6ad-000000000365 15896 1727203878.46550: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203878.46601: no more pending results, returning what we have 15896 1727203878.46604: results queue empty 15896 1727203878.46605: checking for any_errors_fatal 15896 1727203878.46612: done checking for any_errors_fatal 15896 1727203878.46613: checking for max_fail_percentage 15896 1727203878.46615: done checking for max_fail_percentage 15896 1727203878.46615: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.46616: done checking to see if all hosts have failed 15896 1727203878.46617: getting the remaining hosts for this loop 15896 1727203878.46618: done getting the remaining hosts for this loop 15896 1727203878.46621: getting the next task for host managed-node1 15896 1727203878.46627: done getting 
next task for host managed-node1 15896 1727203878.46629: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15896 1727203878.46632: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203878.46636: getting variables 15896 1727203878.46637: in VariableManager get_vars() 15896 1727203878.46692: Calling all_inventory to load vars for managed-node1 15896 1727203878.46694: Calling groups_inventory to load vars for managed-node1 15896 1727203878.46697: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.46707: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.46710: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.46713: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.48405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.50131: done with get_vars() 15896 1727203878.50163: done getting variables 15896 1727203878.50221: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203878.50339: variable 'profile' from source: include params 15896 1727203878.50344: variable 'item' from 
source: include params 15896 1727203878.50417: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.065) 0:00:24.093 ***** 15896 1727203878.50456: entering _queue_task() for managed-node1/assert 15896 1727203878.50919: worker is 1 (out of 1 available) 15896 1727203878.50930: exiting _queue_task() for managed-node1/assert 15896 1727203878.50940: done queuing things up, now waiting for results queue to drain 15896 1727203878.50941: waiting for pending results... 15896 1727203878.51131: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 15896 1727203878.51243: in run() - task 028d2410-947f-fb83-b6ad-000000000366 15896 1727203878.51258: variable 'ansible_search_path' from source: unknown 15896 1727203878.51261: variable 'ansible_search_path' from source: unknown 15896 1727203878.51302: calling self._execute() 15896 1727203878.51406: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.51412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.51424: variable 'omit' from source: magic vars 15896 1727203878.51816: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.51980: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.51984: variable 'omit' from source: magic vars 15896 1727203878.51986: variable 'omit' from source: magic vars 15896 1727203878.51988: variable 'profile' from source: include params 15896 1727203878.51991: variable 'item' from source: include params 15896 1727203878.52054: variable 'item' from source: include params 15896 1727203878.52077: variable 'omit' from source: magic vars 15896 1727203878.52120: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.52159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.52182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203878.52200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.52216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.52246: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203878.52256: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.52259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.52374: Set connection var ansible_shell_type to sh 15896 1727203878.52383: Set connection var ansible_connection to ssh 15896 1727203878.52389: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.52394: Set connection var ansible_pipelining to False 15896 1727203878.52400: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.52405: Set connection var ansible_timeout to 10 15896 1727203878.52432: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.52435: variable 'ansible_connection' from source: unknown 15896 1727203878.52437: variable 'ansible_module_compression' from source: unknown 15896 1727203878.52440: variable 'ansible_shell_type' from source: unknown 15896 1727203878.52442: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.52444: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.52448: variable 'ansible_pipelining' from source: unknown 15896 1727203878.52451: variable 'ansible_timeout' 
from source: unknown 15896 1727203878.52454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.52605: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.52615: variable 'omit' from source: magic vars 15896 1727203878.52620: starting attempt loop 15896 1727203878.52623: running the handler 15896 1727203878.52781: variable 'lsr_net_profile_fingerprint' from source: set_fact 15896 1727203878.52785: Evaluated conditional (lsr_net_profile_fingerprint): True 15896 1727203878.52787: handler run complete 15896 1727203878.52789: attempt loop complete, returning result 15896 1727203878.52791: _execute() done 15896 1727203878.52794: dumping result to json 15896 1727203878.52796: done dumping result, returning 15896 1727203878.52799: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 [028d2410-947f-fb83-b6ad-000000000366] 15896 1727203878.52801: sending task result for task 028d2410-947f-fb83-b6ad-000000000366 15896 1727203878.52860: done sending task result for task 028d2410-947f-fb83-b6ad-000000000366 15896 1727203878.52863: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203878.52914: no more pending results, returning what we have 15896 1727203878.52918: results queue empty 15896 1727203878.52919: checking for any_errors_fatal 15896 1727203878.52928: done checking for any_errors_fatal 15896 1727203878.52928: checking for max_fail_percentage 15896 1727203878.52931: done checking for max_fail_percentage 15896 1727203878.52932: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.52933: done checking to see if all 
hosts have failed 15896 1727203878.52933: getting the remaining hosts for this loop 15896 1727203878.52935: done getting the remaining hosts for this loop 15896 1727203878.52939: getting the next task for host managed-node1 15896 1727203878.52949: done getting next task for host managed-node1 15896 1727203878.52953: ^ task is: TASK: ** TEST check polling interval 15896 1727203878.52955: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203878.52960: getting variables 15896 1727203878.52967: in VariableManager get_vars() 15896 1727203878.53028: Calling all_inventory to load vars for managed-node1 15896 1727203878.53030: Calling groups_inventory to load vars for managed-node1 15896 1727203878.53033: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.53045: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.53047: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.53050: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.54901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.56144: done with get_vars() 15896 1727203878.56159: done getting variables 15896 1727203878.56206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.057) 0:00:24.151 ***** 15896 1727203878.56226: entering _queue_task() for managed-node1/command 15896 1727203878.56467: worker is 1 (out of 1 available) 15896 1727203878.56482: exiting _queue_task() for managed-node1/command 15896 1727203878.56494: done queuing things up, now waiting for results queue to drain 15896 1727203878.56496: waiting for pending results... 15896 1727203878.56666: running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval 15896 1727203878.56730: in run() - task 028d2410-947f-fb83-b6ad-000000000071 15896 1727203878.56743: variable 'ansible_search_path' from source: unknown 15896 1727203878.56774: calling self._execute() 15896 1727203878.56856: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.56860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.56871: variable 'omit' from source: magic vars 15896 1727203878.57160: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.57164: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.57190: variable 'omit' from source: magic vars 15896 1727203878.57193: variable 'omit' from source: magic vars 15896 1727203878.57420: variable 'controller_device' from source: play vars 15896 1727203878.57424: variable 'omit' from source: magic vars 15896 1727203878.57426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.57429: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.57431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203878.57433: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.57435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.57463: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203878.57466: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.57468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.57560: Set connection var ansible_shell_type to sh 15896 1727203878.57572: Set connection var ansible_connection to ssh 15896 1727203878.57577: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.57583: Set connection var ansible_pipelining to False 15896 1727203878.57589: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.57594: Set connection var ansible_timeout to 10 15896 1727203878.57617: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.57620: variable 'ansible_connection' from source: unknown 15896 1727203878.57622: variable 'ansible_module_compression' from source: unknown 15896 1727203878.57625: variable 'ansible_shell_type' from source: unknown 15896 1727203878.57627: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.57629: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.57633: variable 'ansible_pipelining' from source: unknown 15896 1727203878.57636: variable 'ansible_timeout' from source: unknown 15896 1727203878.57638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.57778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.57878: variable 'omit' from source: magic vars 15896 1727203878.57882: starting attempt loop 15896 1727203878.57884: running the handler 15896 1727203878.57886: _low_level_execute_command(): starting 15896 1727203878.57889: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203878.58459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203878.58478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.58489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.58504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203878.58508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203878.58535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.58538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.58541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.58587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK 
<<< 15896 1727203878.58609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.58698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.60500: stdout chunk (state=3): >>>/root <<< 15896 1727203878.60653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203878.60657: stdout chunk (state=3): >>><<< 15896 1727203878.60660: stderr chunk (state=3): >>><<< 15896 1727203878.60697: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203878.60789: _low_level_execute_command(): starting 15896 1727203878.60793: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044 `" && echo ansible-tmp-1727203878.6070395-18003-204147491285044="` echo /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044 `" ) && sleep 0' 15896 1727203878.61310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203878.61331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.61347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.61364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203878.61441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.61497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.61514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.61539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.61658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.63779: stdout chunk (state=3): 
>>>ansible-tmp-1727203878.6070395-18003-204147491285044=/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044 <<< 15896 1727203878.63942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203878.63945: stdout chunk (state=3): >>><<< 15896 1727203878.63948: stderr chunk (state=3): >>><<< 15896 1727203878.64181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203878.6070395-18003-204147491285044=/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203878.64185: variable 'ansible_module_compression' from source: unknown 15896 1727203878.64188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203878.64190: variable 'ansible_facts' 
from source: unknown 15896 1727203878.64202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py 15896 1727203878.64441: Sending initial data 15896 1727203878.64444: Sent initial data (156 bytes) 15896 1727203878.65010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203878.65024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.65039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.65086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.65103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.65120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203878.65196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.65218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.65234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.65261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.65363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
15896 1727203878.67101: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203878.67203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203878.67282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpk00iexo2 /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py <<< 15896 1727203878.67291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py" <<< 15896 1727203878.67350: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpk00iexo2" to remote "/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py" <<< 15896 1727203878.68322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203878.68325: stdout chunk (state=3): >>><<< 15896 1727203878.68327: stderr chunk (state=3): >>><<< 15896 1727203878.68430: done transferring module to remote 
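The exchange above first creates a private, timestamped temp directory on the remote (`umask 77 && mkdir -p ... && mkdir ansible-tmp-<timestamp>-...`), then sftp-puts `AnsiballZ_command.py` into it. A minimal local sketch of the mkdir step, under the assumption that the point of the pattern is a mode-0700 working directory with a unique timestamped name (paths here are illustrative stand-ins, not Ansible's actual implementation):

```python
import os
import tempfile
import time

# Sketch of the `umask 77 && mkdir -p ... && mkdir ansible-tmp-<ts>-...`
# pattern from the log: a mode-0700, timestamped working directory.
# The base path is a local stand-in for ~/.ansible/tmp.
base = os.path.join(tempfile.mkdtemp(), ".ansible", "tmp")
os.makedirs(base, mode=0o700)
workdir = os.path.join(base, f"ansible-tmp-{time.time()}-demo")
os.mkdir(workdir, 0o700)  # private: only the owner can read/write/traverse
print(workdir)
```

The mode-0700 directory keeps the serialized module payload unreadable by other users on the remote host for the lifetime of the task.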
15896 1727203878.68434: _low_level_execute_command(): starting 15896 1727203878.68436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/ /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py && sleep 0' 15896 1727203878.69047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203878.69065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.69094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.69199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.69220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.69239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.69264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.69367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.71355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 
1727203878.71363: stdout chunk (state=3): >>><<< 15896 1727203878.71371: stderr chunk (state=3): >>><<< 15896 1727203878.71391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203878.71396: _low_level_execute_command(): starting 15896 1727203878.71402: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/AnsiballZ_command.py && sleep 0' 15896 1727203878.71807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.71837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.71840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.71884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.71900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.71990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.88704: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:51:18.881737", "end": "2024-09-24 14:51:18.885408", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203878.90497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203878.90517: stderr chunk (state=3): >>><<< 15896 1727203878.90520: stdout chunk (state=3): >>><<< 15896 1727203878.90542: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:51:18.881737", "end": "2024-09-24 14:51:18.885408", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
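The module reply above is a single JSON object on stdout; the task's success conditional, evaluated a few lines below as `'110' in result.stdout`, amounts to parsing that payload and testing a substring. A sketch against the captured output, truncated to the relevant fields:

```python
import json

# The stdout payload captured in the log, reduced to the fields the
# conditional inspects.
raw = '{"changed": true, "stdout": "MII Polling Interval (ms): 110", "rc": 0}'
result = json.loads(raw)

print(result["rc"])               # → 0
print("110" in result["stdout"])  # → True
```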
15896 1727203878.90574: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203878.90583: _low_level_execute_command(): starting 15896 1727203878.90588: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203878.6070395-18003-204147491285044/ > /dev/null 2>&1 && sleep 0' 15896 1727203878.91042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.91045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203878.91050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.91052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.91058: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.91111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.91114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.91120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.91199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203878.93189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203878.93199: stderr chunk (state=3): >>><<< 15896 1727203878.93202: stdout chunk (state=3): >>><<< 15896 1727203878.93241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203878.93244: handler run complete 15896 1727203878.93258: Evaluated conditional (False): False 15896 1727203878.93393: variable 'result' from source: unknown 15896 1727203878.93406: Evaluated conditional ('110' in result.stdout): True 15896 1727203878.93415: attempt loop complete, returning result 15896 1727203878.93418: _execute() done 15896 1727203878.93420: dumping result to json 15896 1727203878.93425: done dumping result, returning 15896 1727203878.93433: done running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval [028d2410-947f-fb83-b6ad-000000000071] 15896 1727203878.93437: sending task result for task 028d2410-947f-fb83-b6ad-000000000071 15896 1727203878.93538: done sending task result for task 028d2410-947f-fb83-b6ad-000000000071 15896 1727203878.93541: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003671", "end": "2024-09-24 14:51:18.885408", "rc": 0, "start": "2024-09-24 14:51:18.881737" } STDOUT: MII Polling Interval (ms): 110 15896 1727203878.93641: no more pending results, returning what we have 15896 1727203878.93644: results queue empty 15896 1727203878.93645: checking for any_errors_fatal 15896 1727203878.93652: done checking for any_errors_fatal 15896 1727203878.93652: checking for max_fail_percentage 15896 1727203878.93655: done checking for max_fail_percentage 15896 1727203878.93655: checking to see if all hosts have failed and the running result is not ok 15896 1727203878.93656: done checking to see if all hosts have failed 15896 1727203878.93656: getting the remaining hosts for this loop 15896 1727203878.93658: done getting the remaining hosts for this loop 15896 1727203878.93661: getting the next task for host managed-node1 15896 
1727203878.93667: done getting next task for host managed-node1 15896 1727203878.93671: ^ task is: TASK: ** TEST check IPv4 15896 1727203878.93672: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203878.93677: getting variables 15896 1727203878.93678: in VariableManager get_vars() 15896 1727203878.93726: Calling all_inventory to load vars for managed-node1 15896 1727203878.93728: Calling groups_inventory to load vars for managed-node1 15896 1727203878.93730: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203878.93740: Calling all_plugins_play to load vars for managed-node1 15896 1727203878.93742: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203878.93744: Calling groups_plugins_play to load vars for managed-node1 15896 1727203878.94531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203878.95712: done with get_vars() 15896 1727203878.95735: done getting variables 15896 1727203878.95791: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Tuesday 24 September 2024 14:51:18 -0400 (0:00:00.395) 0:00:24.547 ***** 15896 1727203878.95816: entering _queue_task() for managed-node1/command 15896 1727203878.96117: worker is 1 (out of 1 available) 
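The polling-interval result above records `"attempts": 1`: the task sits in an attempt loop that reruns the command until its conditional holds or retries run out. A minimal sketch of that retry pattern (illustrative only; `run_until`, its parameters, and the inline task are hypothetical stand-ins, not Ansible's code):

```python
import time

def run_until(check, task, retries=3, delay=0.01):
    """Retry `task` until `check(result)` is true or retries are exhausted."""
    for attempt in range(1, retries + 1):
        result = task()
        if check(result):
            return attempt, result
        time.sleep(delay)
    raise RuntimeError("all retries failed")

# The log's check succeeded on the first try, so attempts == 1.
attempts, out = run_until(lambda r: "110" in r,
                          lambda: "MII Polling Interval (ms): 110")
print(attempts)  # → 1
```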
15896 1727203878.96129: exiting _queue_task() for managed-node1/command 15896 1727203878.96141: done queuing things up, now waiting for results queue to drain 15896 1727203878.96142: waiting for pending results... 15896 1727203878.96504: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 15896 1727203878.96583: in run() - task 028d2410-947f-fb83-b6ad-000000000072 15896 1727203878.96587: variable 'ansible_search_path' from source: unknown 15896 1727203878.96590: calling self._execute() 15896 1727203878.96689: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.96700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.96720: variable 'omit' from source: magic vars 15896 1727203878.97105: variable 'ansible_distribution_major_version' from source: facts 15896 1727203878.97143: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203878.97147: variable 'omit' from source: magic vars 15896 1727203878.97162: variable 'omit' from source: magic vars 15896 1727203878.97267: variable 'controller_device' from source: play vars 15896 1727203878.97359: variable 'omit' from source: magic vars 15896 1727203878.97362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203878.97379: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203878.97404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203878.97425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.97439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203878.97478: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203878.97487: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.97494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.97593: Set connection var ansible_shell_type to sh 15896 1727203878.97606: Set connection var ansible_connection to ssh 15896 1727203878.97615: Set connection var ansible_shell_executable to /bin/sh 15896 1727203878.97624: Set connection var ansible_pipelining to False 15896 1727203878.97690: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203878.97693: Set connection var ansible_timeout to 10 15896 1727203878.97695: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.97697: variable 'ansible_connection' from source: unknown 15896 1727203878.97699: variable 'ansible_module_compression' from source: unknown 15896 1727203878.97701: variable 'ansible_shell_type' from source: unknown 15896 1727203878.97703: variable 'ansible_shell_executable' from source: unknown 15896 1727203878.97705: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203878.97707: variable 'ansible_pipelining' from source: unknown 15896 1727203878.97709: variable 'ansible_timeout' from source: unknown 15896 1727203878.97710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203878.97844: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203878.97862: variable 'omit' from source: magic vars 15896 1727203878.97871: starting attempt loop 15896 1727203878.97879: running the handler 15896 1727203878.97898: _low_level_execute_command(): starting 15896 1727203878.97914: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 15896 1727203878.98613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203878.98630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203878.98650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203878.98669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203878.98695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203878.98766: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203878.98806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203878.98826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203878.98851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203878.98976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.00759: stdout chunk (state=3): >>>/root <<< 15896 1727203879.00895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.00910: stdout chunk (state=3): >>><<< 15896 1727203879.00923: stderr chunk (state=3): >>><<< 15896 1727203879.01046: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.01051: _low_level_execute_command(): starting 15896 1727203879.01055: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822 `" && echo ansible-tmp-1727203879.0095074-18017-119479985361822="` echo /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822 `" ) && sleep 0' 15896 1727203879.01694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203879.01709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203879.01731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.01750: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203879.01844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.01879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.01899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.01921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.02102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.04207: stdout chunk (state=3): >>>ansible-tmp-1727203879.0095074-18017-119479985361822=/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822 <<< 15896 1727203879.04369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.04385: stderr chunk (state=3): >>><<< 15896 1727203879.04394: stdout chunk (state=3): >>><<< 15896 1727203879.04482: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203879.0095074-18017-119479985361822=/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.04486: variable 'ansible_module_compression' from source: unknown 15896 1727203879.04526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203879.04572: variable 'ansible_facts' from source: unknown 15896 1727203879.04669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py 15896 1727203879.04968: Sending initial data 15896 1727203879.04971: Sent initial data (156 bytes) 15896 1727203879.05583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.05611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.05614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.05637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.05759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.07496: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15896 1727203879.07501: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203879.07567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203879.07641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpswyweh8j /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py <<< 15896 1727203879.07647: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py" <<< 15896 1727203879.07717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpswyweh8j" to remote "/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py" <<< 15896 1727203879.07720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py" <<< 15896 1727203879.08374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.08416: stderr chunk (state=3): >>><<< 15896 1727203879.08420: stdout chunk (state=3): >>><<< 15896 1727203879.08473: done transferring module to remote 15896 1727203879.08484: _low_level_execute_command(): starting 15896 1727203879.08488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/ /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py && sleep 0' 15896 1727203879.09166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.09198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.09341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.11265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.11291: stderr chunk (state=3): >>><<< 15896 1727203879.11294: stdout chunk (state=3): >>><<< 15896 1727203879.11307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.11310: _low_level_execute_command(): starting 15896 1727203879.11315: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/AnsiballZ_command.py && sleep 0' 15896 1727203879.11898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203879.11901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.11974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.11980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.12103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.28919: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 
qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.120/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:51:19.283622", "end": "2024-09-24 14:51:19.287578", "delta": "0:00:00.003956", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203879.30614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203879.30642: stderr chunk (state=3): >>><<< 15896 1727203879.30646: stdout chunk (state=3): >>><<< 15896 1727203879.30661: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.120/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:51:19.283622", "end": "2024-09-24 14:51:19.287578", "delta": "0:00:00.003956", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203879.30701: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203879.30709: _low_level_execute_command(): starting 15896 1727203879.30714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203879.0095074-18017-119479985361822/ > /dev/null 2>&1 && sleep 0' 15896 1727203879.31158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.31166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.31168: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203879.31175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203879.31186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.31221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.31225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.31229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.31305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.33254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.33280: stderr chunk (state=3): >>><<< 15896 1727203879.33283: stdout chunk (state=3): >>><<< 15896 1727203879.33299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.33305: handler run complete 15896 1727203879.33326: Evaluated conditional (False): False 15896 1727203879.33439: variable 'result' from source: set_fact 15896 1727203879.33452: Evaluated conditional ('192.0.2' in result.stdout): True 15896 1727203879.33461: attempt loop complete, returning result 15896 1727203879.33467: _execute() done 15896 1727203879.33469: dumping result to json 15896 1727203879.33477: done dumping result, returning 15896 1727203879.33486: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 [028d2410-947f-fb83-b6ad-000000000072] 15896 1727203879.33489: sending task result for task 028d2410-947f-fb83-b6ad-000000000072 15896 1727203879.33587: done sending task result for task 028d2410-947f-fb83-b6ad-000000000072 15896 1727203879.33590: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003956", "end": "2024-09-24 14:51:19.287578", "rc": 0, "start": 
"2024-09-24 14:51:19.283622" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.120/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 15896 1727203879.33663: no more pending results, returning what we have 15896 1727203879.33666: results queue empty 15896 1727203879.33667: checking for any_errors_fatal 15896 1727203879.33684: done checking for any_errors_fatal 15896 1727203879.33684: checking for max_fail_percentage 15896 1727203879.33687: done checking for max_fail_percentage 15896 1727203879.33687: checking to see if all hosts have failed and the running result is not ok 15896 1727203879.33688: done checking to see if all hosts have failed 15896 1727203879.33689: getting the remaining hosts for this loop 15896 1727203879.33690: done getting the remaining hosts for this loop 15896 1727203879.33693: getting the next task for host managed-node1 15896 1727203879.33701: done getting next task for host managed-node1 15896 1727203879.33703: ^ task is: TASK: ** TEST check IPv6 15896 1727203879.33705: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203879.33708: getting variables 15896 1727203879.33709: in VariableManager get_vars() 15896 1727203879.33759: Calling all_inventory to load vars for managed-node1 15896 1727203879.33761: Calling groups_inventory to load vars for managed-node1 15896 1727203879.33763: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.33773: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.33780: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.33784: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.34710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.35549: done with get_vars() 15896 1727203879.35566: done getting variables 15896 1727203879.35609: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.398) 0:00:24.945 ***** 15896 1727203879.35630: entering _queue_task() for managed-node1/command 15896 1727203879.35861: worker is 1 (out of 1 available) 15896 1727203879.35877: exiting _queue_task() for managed-node1/command 15896 1727203879.35890: done queuing things up, now waiting for results queue to drain 15896 1727203879.35892: waiting for pending results... 
15896 1727203879.36064: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 15896 1727203879.36131: in run() - task 028d2410-947f-fb83-b6ad-000000000073 15896 1727203879.36143: variable 'ansible_search_path' from source: unknown 15896 1727203879.36174: calling self._execute() 15896 1727203879.36256: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.36261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.36273: variable 'omit' from source: magic vars 15896 1727203879.36544: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.36554: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.36568: variable 'omit' from source: magic vars 15896 1727203879.36586: variable 'omit' from source: magic vars 15896 1727203879.36650: variable 'controller_device' from source: play vars 15896 1727203879.36668: variable 'omit' from source: magic vars 15896 1727203879.36706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203879.36732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203879.36747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203879.36760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203879.36774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203879.36802: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203879.36806: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.36808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.36873: Set 
connection var ansible_shell_type to sh 15896 1727203879.36887: Set connection var ansible_connection to ssh 15896 1727203879.36890: Set connection var ansible_shell_executable to /bin/sh 15896 1727203879.36893: Set connection var ansible_pipelining to False 15896 1727203879.36897: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203879.36902: Set connection var ansible_timeout to 10 15896 1727203879.36919: variable 'ansible_shell_executable' from source: unknown 15896 1727203879.36922: variable 'ansible_connection' from source: unknown 15896 1727203879.36925: variable 'ansible_module_compression' from source: unknown 15896 1727203879.36927: variable 'ansible_shell_type' from source: unknown 15896 1727203879.36930: variable 'ansible_shell_executable' from source: unknown 15896 1727203879.36932: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.36934: variable 'ansible_pipelining' from source: unknown 15896 1727203879.36936: variable 'ansible_timeout' from source: unknown 15896 1727203879.36941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.37044: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203879.37054: variable 'omit' from source: magic vars 15896 1727203879.37059: starting attempt loop 15896 1727203879.37065: running the handler 15896 1727203879.37077: _low_level_execute_command(): starting 15896 1727203879.37085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203879.37604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 
1727203879.37608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.37610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.37613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.37662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.37667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.37669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.37755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.39588: stdout chunk (state=3): >>>/root <<< 15896 1727203879.39678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.39710: stdout chunk (state=3): >>><<< 15896 1727203879.39714: stderr chunk (state=3): >>><<< 15896 1727203879.39735: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.39836: _low_level_execute_command(): starting 15896 1727203879.39840: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163 `" && echo ansible-tmp-1727203879.397433-18033-256521314237163="` echo /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163 `" ) && sleep 0' 15896 1727203879.40462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.40514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.40521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.40608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.42718: stdout chunk (state=3): >>>ansible-tmp-1727203879.397433-18033-256521314237163=/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163 <<< 15896 1727203879.42849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.42852: stdout chunk (state=3): >>><<< 15896 1727203879.42859: stderr chunk (state=3): >>><<< 15896 1727203879.42876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203879.397433-18033-256521314237163=/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.42903: variable 'ansible_module_compression' from source: unknown 15896 1727203879.42943: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203879.42976: variable 'ansible_facts' from source: unknown 15896 1727203879.43028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py 15896 1727203879.43129: Sending initial data 15896 1727203879.43132: Sent initial data (155 bytes) 15896 1727203879.43583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.43586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203879.43588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203879.43590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.43643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.43646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.43649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.43734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.45446: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203879.45524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203879.45601: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpr_3pxrh_ /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py <<< 15896 1727203879.45608: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py" <<< 15896 1727203879.45679: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpr_3pxrh_" to remote "/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py" <<< 15896 1727203879.45687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py" <<< 15896 1727203879.46355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.46393: stderr chunk (state=3): >>><<< 15896 1727203879.46396: stdout chunk (state=3): >>><<< 15896 1727203879.46435: done transferring module to remote 15896 1727203879.46452: _low_level_execute_command(): starting 15896 1727203879.46455: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/ /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py && sleep 0' 15896 1727203879.46872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.46875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203879.46879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203879.46883: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.46885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.46930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.46933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.47018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.49011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.49014: stdout chunk (state=3): >>><<< 15896 1727203879.49017: stderr chunk (state=3): >>><<< 15896 1727203879.49031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.49038: _low_level_execute_command(): starting 15896 1727203879.49047: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/AnsiballZ_command.py && sleep 0' 15896 1727203879.49647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203879.49662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203879.49679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.49696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203879.49714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203879.49794: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203879.49798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.49839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.49858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.49888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.49997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.66994: stdout chunk (state=3): >>> {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::13/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::701d:e2ff:fe7a:b896/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::701d:e2ff:fe7a:b896/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:51:19.664431", "end": "2024-09-24 14:51:19.668365", "delta": "0:00:00.003934", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203879.68803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203879.68825: stderr chunk (state=3): >>><<< 15896 1727203879.68829: stdout chunk (state=3): >>><<< 15896 1727203879.68846: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::13/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::701d:e2ff:fe7a:b896/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::701d:e2ff:fe7a:b896/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:51:19.664431", "end": "2024-09-24 14:51:19.668365", "delta": "0:00:00.003934", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203879.68885: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203879.68891: _low_level_execute_command(): starting 15896 1727203879.68896: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203879.397433-18033-256521314237163/ > /dev/null 2>&1 && sleep 0' 15896 1727203879.69323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.69331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.69333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.69335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.69381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.69384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.69465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.71442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.71480: stderr chunk (state=3): >>><<< 15896 1727203879.71486: stdout chunk (state=3): >>><<< 15896 1727203879.71581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.71585: handler run complete 15896 1727203879.71587: Evaluated conditional (False): False 15896 1727203879.71648: variable 'result' from source: set_fact 15896 1727203879.71661: Evaluated conditional ('2001' in result.stdout): True 15896 1727203879.71672: attempt loop complete, returning result 15896 1727203879.71680: _execute() done 15896 1727203879.71683: dumping result to json 15896 1727203879.71685: done dumping result, returning 15896 1727203879.71692: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 [028d2410-947f-fb83-b6ad-000000000073] 15896 1727203879.71696: sending task result for task 028d2410-947f-fb83-b6ad-000000000073 15896 1727203879.71792: done sending task result for task 028d2410-947f-fb83-b6ad-000000000073 15896 1727203879.71795: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003934", "end": "2024-09-24 14:51:19.668365", "rc": 0, "start": "2024-09-24 14:51:19.664431" } STDOUT: 19: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::13/128 scope global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::701d:e2ff:fe7a:b896/64 scope global dynamic noprefixroute valid_lft 1795sec preferred_lft 1795sec inet6 fe80::701d:e2ff:fe7a:b896/64 scope link noprefixroute valid_lft forever preferred_lft forever 15896 1727203879.71888: no more pending results, returning what we have 15896 1727203879.71892: results queue empty 15896 1727203879.71893: checking for any_errors_fatal 15896 1727203879.71902: done checking for any_errors_fatal 15896 1727203879.71903: checking for 
max_fail_percentage 15896 1727203879.71905: done checking for max_fail_percentage 15896 1727203879.71906: checking to see if all hosts have failed and the running result is not ok 15896 1727203879.71907: done checking to see if all hosts have failed 15896 1727203879.71907: getting the remaining hosts for this loop 15896 1727203879.71909: done getting the remaining hosts for this loop 15896 1727203879.71912: getting the next task for host managed-node1 15896 1727203879.71918: done getting next task for host managed-node1 15896 1727203879.71925: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203879.71927: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203879.71944: getting variables 15896 1727203879.71945: in VariableManager get_vars() 15896 1727203879.71996: Calling all_inventory to load vars for managed-node1 15896 1727203879.71999: Calling groups_inventory to load vars for managed-node1 15896 1727203879.72001: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.72010: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.72012: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.72015: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.72801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.74048: done with get_vars() 15896 1727203879.74073: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.385) 0:00:25.331 ***** 15896 1727203879.74170: entering _queue_task() for managed-node1/include_tasks 15896 1727203879.74483: worker is 1 (out of 1 available) 15896 1727203879.74498: exiting _queue_task() for managed-node1/include_tasks 15896 1727203879.74511: done queuing things up, now waiting for results queue to drain 15896 1727203879.74513: waiting for pending results... 
15896 1727203879.74731: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203879.74848: in run() - task 028d2410-947f-fb83-b6ad-00000000007b 15896 1727203879.74862: variable 'ansible_search_path' from source: unknown 15896 1727203879.74865: variable 'ansible_search_path' from source: unknown 15896 1727203879.74969: calling self._execute() 15896 1727203879.75012: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.75016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.75019: variable 'omit' from source: magic vars 15896 1727203879.75406: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.75449: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.75454: _execute() done 15896 1727203879.75457: dumping result to json 15896 1727203879.75460: done dumping result, returning 15896 1727203879.75463: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-fb83-b6ad-00000000007b] 15896 1727203879.75466: sending task result for task 028d2410-947f-fb83-b6ad-00000000007b 15896 1727203879.75534: done sending task result for task 028d2410-947f-fb83-b6ad-00000000007b 15896 1727203879.75537: WORKER PROCESS EXITING 15896 1727203879.75599: no more pending results, returning what we have 15896 1727203879.75604: in VariableManager get_vars() 15896 1727203879.75756: Calling all_inventory to load vars for managed-node1 15896 1727203879.75759: Calling groups_inventory to load vars for managed-node1 15896 1727203879.75764: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.75773: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.75778: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.75781: Calling 
groups_plugins_play to load vars for managed-node1 15896 1727203879.76836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.77680: done with get_vars() 15896 1727203879.77693: variable 'ansible_search_path' from source: unknown 15896 1727203879.77694: variable 'ansible_search_path' from source: unknown 15896 1727203879.77722: we have included files to process 15896 1727203879.77723: generating all_blocks data 15896 1727203879.77725: done generating all_blocks data 15896 1727203879.77728: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203879.77729: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203879.77730: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203879.78178: done processing included file 15896 1727203879.78180: iterating over new_blocks loaded from include file 15896 1727203879.78182: in VariableManager get_vars() 15896 1727203879.78215: done with get_vars() 15896 1727203879.78217: filtering new block on tags 15896 1727203879.78234: done filtering new block on tags 15896 1727203879.78237: in VariableManager get_vars() 15896 1727203879.78268: done with get_vars() 15896 1727203879.78270: filtering new block on tags 15896 1727203879.78294: done filtering new block on tags 15896 1727203879.78296: in VariableManager get_vars() 15896 1727203879.78327: done with get_vars() 15896 1727203879.78329: filtering new block on tags 15896 1727203879.78348: done filtering new block on tags 15896 1727203879.78349: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 15896 1727203879.78355: extending task lists for 
all hosts with included blocks 15896 1727203879.79083: done extending task lists 15896 1727203879.79084: done processing included files 15896 1727203879.79084: results queue empty 15896 1727203879.79085: checking for any_errors_fatal 15896 1727203879.79088: done checking for any_errors_fatal 15896 1727203879.79088: checking for max_fail_percentage 15896 1727203879.79089: done checking for max_fail_percentage 15896 1727203879.79089: checking to see if all hosts have failed and the running result is not ok 15896 1727203879.79090: done checking to see if all hosts have failed 15896 1727203879.79090: getting the remaining hosts for this loop 15896 1727203879.79091: done getting the remaining hosts for this loop 15896 1727203879.79093: getting the next task for host managed-node1 15896 1727203879.79096: done getting next task for host managed-node1 15896 1727203879.79097: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203879.79099: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203879.79106: getting variables 15896 1727203879.79107: in VariableManager get_vars() 15896 1727203879.79120: Calling all_inventory to load vars for managed-node1 15896 1727203879.79122: Calling groups_inventory to load vars for managed-node1 15896 1727203879.79123: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.79127: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.79128: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.79130: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.79789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.80691: done with get_vars() 15896 1727203879.80705: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.065) 0:00:25.396 ***** 15896 1727203879.80755: entering _queue_task() for managed-node1/setup 15896 1727203879.81009: worker is 1 (out of 1 available) 15896 1727203879.81021: exiting _queue_task() for managed-node1/setup 15896 1727203879.81034: done queuing things up, now waiting for results queue to drain 15896 1727203879.81035: waiting for pending results... 
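Every debug record above carries the same prefix: the controller PID, a fractional Unix epoch, a colon, then the message (e.g. `15896 1727203879.80755: entering _queue_task() for managed-node1/setup`). As a hedged illustration for reading these logs (this helper is not part of Ansible; the prefix shape is inferred from the output above):

```python
from datetime import datetime, timezone

def parse_debug_line(line):
    """Split an `ansible-playbook -vvvv` debug record of the form
    'PID EPOCH: message' into (pid, timestamp, message).

    Illustrative only: the format is inferred from this log, not from
    Ansible's source, and may change between versions."""
    pid, stamp, message = line.split(" ", 2)
    ts = datetime.fromtimestamp(float(stamp.rstrip(":")), tz=timezone.utc)
    return int(pid), ts, message
```

For example, `1727203879.80755` decodes to 2024-09-24 18:51:19 UTC, matching the `Tuesday 24 September 2024 14:51:19 -0400` task banner.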
15896 1727203879.81223: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203879.81333: in run() - task 028d2410-947f-fb83-b6ad-0000000006c5 15896 1727203879.81347: variable 'ansible_search_path' from source: unknown 15896 1727203879.81351: variable 'ansible_search_path' from source: unknown 15896 1727203879.81383: calling self._execute() 15896 1727203879.81454: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.81459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.81471: variable 'omit' from source: magic vars 15896 1727203879.81743: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.81753: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.81901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203879.83363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203879.83425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203879.83455: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203879.83484: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203879.83504: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203879.83563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203879.83588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203879.83605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203879.83630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203879.83640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203879.83684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203879.83700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203879.83716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203879.83740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203879.83750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203879.83862: variable '__network_required_facts' from source: role 
'' defaults 15896 1727203879.83874: variable 'ansible_facts' from source: unknown 15896 1727203879.84311: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15896 1727203879.84315: when evaluation is False, skipping this task 15896 1727203879.84318: _execute() done 15896 1727203879.84320: dumping result to json 15896 1727203879.84323: done dumping result, returning 15896 1727203879.84326: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-fb83-b6ad-0000000006c5] 15896 1727203879.84331: sending task result for task 028d2410-947f-fb83-b6ad-0000000006c5 15896 1727203879.84416: done sending task result for task 028d2410-947f-fb83-b6ad-0000000006c5 15896 1727203879.84418: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203879.84463: no more pending results, returning what we have 15896 1727203879.84467: results queue empty 15896 1727203879.84468: checking for any_errors_fatal 15896 1727203879.84469: done checking for any_errors_fatal 15896 1727203879.84470: checking for max_fail_percentage 15896 1727203879.84472: done checking for max_fail_percentage 15896 1727203879.84472: checking to see if all hosts have failed and the running result is not ok 15896 1727203879.84473: done checking to see if all hosts have failed 15896 1727203879.84474: getting the remaining hosts for this loop 15896 1727203879.84477: done getting the remaining hosts for this loop 15896 1727203879.84480: getting the next task for host managed-node1 15896 1727203879.84488: done getting next task for host managed-node1 15896 1727203879.84491: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203879.84495: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203879.84511: getting variables 15896 1727203879.84512: in VariableManager get_vars() 15896 1727203879.84571: Calling all_inventory to load vars for managed-node1 15896 1727203879.84574: Calling groups_inventory to load vars for managed-node1 15896 1727203879.84578: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.84588: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.84590: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.84593: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.85391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.86257: done with get_vars() 15896 1727203879.86274: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.055) 0:00:25.452 ***** 15896 1727203879.86348: entering _queue_task() for managed-node1/stat 15896 1727203879.86586: worker is 1 (out of 1 
available) 15896 1727203879.86599: exiting _queue_task() for managed-node1/stat 15896 1727203879.86610: done queuing things up, now waiting for results queue to drain 15896 1727203879.86612: waiting for pending results... 15896 1727203879.86793: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203879.86903: in run() - task 028d2410-947f-fb83-b6ad-0000000006c7 15896 1727203879.86916: variable 'ansible_search_path' from source: unknown 15896 1727203879.86920: variable 'ansible_search_path' from source: unknown 15896 1727203879.86951: calling self._execute() 15896 1727203879.87026: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.87033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.87041: variable 'omit' from source: magic vars 15896 1727203879.87309: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.87319: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.87432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203879.87623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203879.87654: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203879.87682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203879.87712: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203879.87773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203879.87793: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203879.87812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203879.87832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203879.87896: variable '__network_is_ostree' from source: set_fact 15896 1727203879.87900: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203879.87903: when evaluation is False, skipping this task 15896 1727203879.87906: _execute() done 15896 1727203879.87910: dumping result to json 15896 1727203879.87913: done dumping result, returning 15896 1727203879.87920: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-fb83-b6ad-0000000006c7] 15896 1727203879.87924: sending task result for task 028d2410-947f-fb83-b6ad-0000000006c7 15896 1727203879.88007: done sending task result for task 028d2410-947f-fb83-b6ad-0000000006c7 15896 1727203879.88009: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203879.88085: no more pending results, returning what we have 15896 1727203879.88088: results queue empty 15896 1727203879.88089: checking for any_errors_fatal 15896 1727203879.88093: done checking for any_errors_fatal 15896 1727203879.88094: checking for max_fail_percentage 15896 1727203879.88096: done checking for max_fail_percentage 15896 1727203879.88096: checking to see if all hosts have failed and the running result is not ok 15896 
1727203879.88097: done checking to see if all hosts have failed 15896 1727203879.88098: getting the remaining hosts for this loop 15896 1727203879.88099: done getting the remaining hosts for this loop 15896 1727203879.88103: getting the next task for host managed-node1 15896 1727203879.88108: done getting next task for host managed-node1 15896 1727203879.88111: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203879.88114: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203879.88129: getting variables 15896 1727203879.88130: in VariableManager get_vars() 15896 1727203879.88174: Calling all_inventory to load vars for managed-node1 15896 1727203879.88179: Calling groups_inventory to load vars for managed-node1 15896 1727203879.88181: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.88189: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.88191: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.88194: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.89084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.89933: done with get_vars() 15896 1727203879.89948: done getting variables 15896 1727203879.89993: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.036) 0:00:25.489 ***** 15896 1727203879.90015: entering _queue_task() for managed-node1/set_fact 15896 1727203879.90242: worker is 1 (out of 1 available) 15896 1727203879.90256: exiting _queue_task() for managed-node1/set_fact 15896 1727203879.90269: done queuing things up, now waiting for results queue to drain 15896 1727203879.90270: waiting for pending results... 
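The skip of "Ensure ansible_facts used by role are present" above turns on the expression `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the setup task runs only when some required fact is still missing. A rough Python equivalent of that gate (an illustration of the `difference` filter's set semantics, not code from the role):

```python
def facts_gathering_needed(required_facts, ansible_facts):
    """Mirror the role's 'when' expression: True only if at least one
    required fact key is absent from the facts already collected.
    Jinja2's difference filter is set subtraction, so duplicates and
    ordering are irrelevant."""
    missing = set(required_facts) - set(ansible_facts)
    return len(missing) > 0

# In the run logged above every required fact was already cached, so the
# expression evaluated to False and the setup task was skipped.
```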
15896 1727203879.90440: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203879.90549: in run() - task 028d2410-947f-fb83-b6ad-0000000006c8 15896 1727203879.90560: variable 'ansible_search_path' from source: unknown 15896 1727203879.90564: variable 'ansible_search_path' from source: unknown 15896 1727203879.90595: calling self._execute() 15896 1727203879.90667: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.90672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.90683: variable 'omit' from source: magic vars 15896 1727203879.90949: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.90958: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.91073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203879.91257: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203879.91294: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203879.91319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203879.91346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203879.91412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203879.91431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203879.91449: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203879.91470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203879.91535: variable '__network_is_ostree' from source: set_fact 15896 1727203879.91539: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203879.91542: when evaluation is False, skipping this task 15896 1727203879.91544: _execute() done 15896 1727203879.91549: dumping result to json 15896 1727203879.91551: done dumping result, returning 15896 1727203879.91559: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-0000000006c8] 15896 1727203879.91566: sending task result for task 028d2410-947f-fb83-b6ad-0000000006c8 15896 1727203879.91648: done sending task result for task 028d2410-947f-fb83-b6ad-0000000006c8 15896 1727203879.91651: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203879.91698: no more pending results, returning what we have 15896 1727203879.91702: results queue empty 15896 1727203879.91702: checking for any_errors_fatal 15896 1727203879.91707: done checking for any_errors_fatal 15896 1727203879.91708: checking for max_fail_percentage 15896 1727203879.91710: done checking for max_fail_percentage 15896 1727203879.91711: checking to see if all hosts have failed and the running result is not ok 15896 1727203879.91711: done checking to see if all hosts have failed 15896 1727203879.91712: getting the remaining hosts for this loop 15896 1727203879.91713: done getting the remaining hosts for this loop 
15896 1727203879.91717: getting the next task for host managed-node1 15896 1727203879.91725: done getting next task for host managed-node1 15896 1727203879.91728: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203879.91731: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203879.91747: getting variables 15896 1727203879.91749: in VariableManager get_vars() 15896 1727203879.91793: Calling all_inventory to load vars for managed-node1 15896 1727203879.91796: Calling groups_inventory to load vars for managed-node1 15896 1727203879.91798: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203879.91805: Calling all_plugins_play to load vars for managed-node1 15896 1727203879.91808: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203879.91810: Calling groups_plugins_play to load vars for managed-node1 15896 1727203879.92566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203879.93417: done with get_vars() 15896 1727203879.93431: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:19 -0400 (0:00:00.034) 0:00:25.524 ***** 15896 1727203879.93498: entering _queue_task() for managed-node1/service_facts 15896 1727203879.93719: worker is 1 (out of 1 available) 15896 1727203879.93733: exiting _queue_task() for managed-node1/service_facts 15896 1727203879.93746: done queuing things up, now waiting for results queue to drain 15896 1727203879.93747: waiting for pending results... 
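Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") were skipped on `not __network_is_ostree is defined`: once `set_fact` has cached `__network_is_ostree`, later invocations of the role skip the check entirely. Reduced to plain Python (the Jinja2 `is defined` test treated as a key lookup; variable name taken from the log, the rest a sketch):

```python
def should_check_ostree(host_vars):
    """Run the ostree stat/set_fact pair only when the cached flag is
    absent, i.e. when 'not __network_is_ostree is defined' holds."""
    return "__network_is_ostree" not in host_vars
```

On the first role run the flag is absent and both tasks execute; in the run logged above, `set_fact` had already supplied it, so both reported `false_condition: not __network_is_ostree is defined`.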
15896 1727203879.93920: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203879.94027: in run() - task 028d2410-947f-fb83-b6ad-0000000006ca 15896 1727203879.94039: variable 'ansible_search_path' from source: unknown 15896 1727203879.94042: variable 'ansible_search_path' from source: unknown 15896 1727203879.94072: calling self._execute() 15896 1727203879.94145: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.94149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.94157: variable 'omit' from source: magic vars 15896 1727203879.94424: variable 'ansible_distribution_major_version' from source: facts 15896 1727203879.94433: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203879.94439: variable 'omit' from source: magic vars 15896 1727203879.94489: variable 'omit' from source: magic vars 15896 1727203879.94513: variable 'omit' from source: magic vars 15896 1727203879.94548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203879.94574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203879.94591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203879.94603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203879.94613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203879.94639: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203879.94642: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.94644: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203879.94713: Set connection var ansible_shell_type to sh 15896 1727203879.94719: Set connection var ansible_connection to ssh 15896 1727203879.94725: Set connection var ansible_shell_executable to /bin/sh 15896 1727203879.94729: Set connection var ansible_pipelining to False 15896 1727203879.94740: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203879.94743: Set connection var ansible_timeout to 10 15896 1727203879.94757: variable 'ansible_shell_executable' from source: unknown 15896 1727203879.94760: variable 'ansible_connection' from source: unknown 15896 1727203879.94762: variable 'ansible_module_compression' from source: unknown 15896 1727203879.94767: variable 'ansible_shell_type' from source: unknown 15896 1727203879.94770: variable 'ansible_shell_executable' from source: unknown 15896 1727203879.94772: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203879.94778: variable 'ansible_pipelining' from source: unknown 15896 1727203879.94780: variable 'ansible_timeout' from source: unknown 15896 1727203879.94783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203879.94924: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203879.94934: variable 'omit' from source: magic vars 15896 1727203879.94939: starting attempt loop 15896 1727203879.94941: running the handler 15896 1727203879.94960: _low_level_execute_command(): starting 15896 1727203879.94963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203879.95474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15896 1727203879.95480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.95483: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203879.95485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.95536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.95539: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203879.95541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.95632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203879.97403: stdout chunk (state=3): >>>/root <<< 15896 1727203879.97505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203879.97536: stderr chunk (state=3): >>><<< 15896 1727203879.97539: stdout chunk (state=3): >>><<< 15896 1727203879.97560: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203879.97573: _low_level_execute_command(): starting 15896 1727203879.97581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349 `" && echo ansible-tmp-1727203879.9755938-18062-150749438654349="` echo /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349 `" ) && sleep 0' 15896 1727203879.98024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203879.98028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203879.98030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.98039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203879.98042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203879.98085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203879.98103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203879.98181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203880.00264: stdout chunk (state=3): >>>ansible-tmp-1727203879.9755938-18062-150749438654349=/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349 <<< 15896 1727203880.00374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203880.00399: stderr chunk (state=3): >>><<< 15896 1727203880.00402: stdout chunk (state=3): >>><<< 15896 1727203880.00414: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203879.9755938-18062-150749438654349=/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203880.00454: variable 'ansible_module_compression' from source: unknown 15896 1727203880.00493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15896 1727203880.00523: variable 'ansible_facts' from source: unknown 15896 1727203880.00588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py 15896 1727203880.00688: Sending initial data 15896 1727203880.00691: Sent initial data (162 bytes) 15896 1727203880.01127: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203880.01130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203880.01132: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203880.01134: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203880.01136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203880.01181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203880.01187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203880.01268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203880.03002: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15896 1727203880.03006: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203880.03082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203880.03165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmppfvjadlh /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py <<< 15896 1727203880.03167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py" <<< 15896 1727203880.03229: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmppfvjadlh" to remote "/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py" <<< 15896 1727203880.03234: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py" <<< 15896 1727203880.03927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203880.03963: stderr chunk (state=3): >>><<< 15896 1727203880.03966: stdout chunk (state=3): >>><<< 15896 1727203880.03993: done transferring module to remote 15896 1727203880.04001: _low_level_execute_command(): starting 15896 1727203880.04006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/ /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py && sleep 0' 15896 1727203880.04439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203880.04442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203880.04445: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203880.04447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203880.04453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203880.04508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203880.04511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203880.04516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203880.04594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203880.06546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203880.06573: stderr chunk (state=3): >>><<< 15896 1727203880.06579: stdout chunk (state=3): >>><<< 15896 1727203880.06590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203880.06593: _low_level_execute_command(): starting 15896 1727203880.06600: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/AnsiballZ_service_facts.py && sleep 0' 15896 1727203880.07040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203880.07044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203880.07047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203880.07049: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203880.07051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203880.07097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203880.07104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203880.07105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203880.07191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203881.83723: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15896 1727203881.83742: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 15896 1727203881.83784: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15896 1727203881.83788: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 15896 1727203881.83792: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15896 1727203881.85536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203881.85568: stderr chunk (state=3): >>><<< 15896 1727203881.85573: stdout chunk (state=3): >>><<< 15896 1727203881.85604: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203881.86268: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203881.86274: _low_level_execute_command(): starting 15896 1727203881.86281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203879.9755938-18062-150749438654349/ > /dev/null 2>&1 && sleep 0' 15896 1727203881.86746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203881.86750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.86752: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203881.86754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203881.86756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.86811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203881.86814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203881.86817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203881.86903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203881.88890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203881.88917: stderr chunk (state=3): >>><<< 15896 1727203881.88921: stdout chunk (state=3): >>><<< 15896 1727203881.88936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203881.88942: handler run complete 15896 1727203881.89055: variable 'ansible_facts' from source: unknown 15896 1727203881.89149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203881.89424: variable 'ansible_facts' from source: unknown 15896 1727203881.89507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203881.89620: attempt loop complete, returning result 15896 1727203881.89624: _execute() done 15896 1727203881.89626: dumping result to json 15896 1727203881.89660: done dumping result, returning 15896 1727203881.89671: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-fb83-b6ad-0000000006ca] 15896 1727203881.89674: sending task result for task 028d2410-947f-fb83-b6ad-0000000006ca 15896 1727203881.90373: done sending task result for task 028d2410-947f-fb83-b6ad-0000000006ca 15896 1727203881.90379: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203881.90434: no more pending results, returning what we have 15896 1727203881.90436: results queue empty 15896 1727203881.90436: checking for any_errors_fatal 15896 1727203881.90439: done checking for any_errors_fatal 15896 1727203881.90439: checking for max_fail_percentage 15896 
1727203881.90440: done checking for max_fail_percentage 15896 1727203881.90440: checking to see if all hosts have failed and the running result is not ok 15896 1727203881.90441: done checking to see if all hosts have failed 15896 1727203881.90441: getting the remaining hosts for this loop 15896 1727203881.90442: done getting the remaining hosts for this loop 15896 1727203881.90444: getting the next task for host managed-node1 15896 1727203881.90448: done getting next task for host managed-node1 15896 1727203881.90450: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203881.90453: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203881.90459: getting variables 15896 1727203881.90462: in VariableManager get_vars() 15896 1727203881.90495: Calling all_inventory to load vars for managed-node1 15896 1727203881.90497: Calling groups_inventory to load vars for managed-node1 15896 1727203881.90498: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203881.90505: Calling all_plugins_play to load vars for managed-node1 15896 1727203881.90508: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203881.90510: Calling groups_plugins_play to load vars for managed-node1 15896 1727203881.91188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203881.92066: done with get_vars() 15896 1727203881.92084: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:51:21 -0400 (0:00:01.986) 0:00:27.510 ***** 15896 1727203881.92155: entering _queue_task() for managed-node1/package_facts 15896 1727203881.92395: worker is 1 (out of 1 available) 15896 1727203881.92409: exiting _queue_task() for managed-node1/package_facts 15896 1727203881.92421: done queuing things up, now waiting for results queue to drain 15896 1727203881.92423: waiting for pending results... 
15896 1727203881.92605: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203881.92705: in run() - task 028d2410-947f-fb83-b6ad-0000000006cb 15896 1727203881.92718: variable 'ansible_search_path' from source: unknown 15896 1727203881.92722: variable 'ansible_search_path' from source: unknown 15896 1727203881.92748: calling self._execute() 15896 1727203881.92825: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203881.92829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203881.92840: variable 'omit' from source: magic vars 15896 1727203881.93117: variable 'ansible_distribution_major_version' from source: facts 15896 1727203881.93126: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203881.93132: variable 'omit' from source: magic vars 15896 1727203881.93186: variable 'omit' from source: magic vars 15896 1727203881.93210: variable 'omit' from source: magic vars 15896 1727203881.93242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203881.93279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203881.93296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203881.93310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203881.93321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203881.93344: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203881.93347: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203881.93350: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203881.93419: Set connection var ansible_shell_type to sh 15896 1727203881.93425: Set connection var ansible_connection to ssh 15896 1727203881.93430: Set connection var ansible_shell_executable to /bin/sh 15896 1727203881.93434: Set connection var ansible_pipelining to False 15896 1727203881.93440: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203881.93445: Set connection var ansible_timeout to 10 15896 1727203881.93464: variable 'ansible_shell_executable' from source: unknown 15896 1727203881.93468: variable 'ansible_connection' from source: unknown 15896 1727203881.93470: variable 'ansible_module_compression' from source: unknown 15896 1727203881.93473: variable 'ansible_shell_type' from source: unknown 15896 1727203881.93475: variable 'ansible_shell_executable' from source: unknown 15896 1727203881.93479: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203881.93481: variable 'ansible_pipelining' from source: unknown 15896 1727203881.93483: variable 'ansible_timeout' from source: unknown 15896 1727203881.93486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203881.93629: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203881.93637: variable 'omit' from source: magic vars 15896 1727203881.93642: starting attempt loop 15896 1727203881.93644: running the handler 15896 1727203881.93658: _low_level_execute_command(): starting 15896 1727203881.93680: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203881.94183: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15896 1727203881.94187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.94190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203881.94194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.94249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203881.94252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203881.94259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203881.94340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203881.96112: stdout chunk (state=3): >>>/root <<< 15896 1727203881.96205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203881.96234: stderr chunk (state=3): >>><<< 15896 1727203881.96237: stdout chunk (state=3): >>><<< 15896 1727203881.96257: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203881.96272: _low_level_execute_command(): starting 15896 1727203881.96295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128 `" && echo ansible-tmp-1727203881.9625673-18126-76345331409128="` echo /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128 `" ) && sleep 0' 15896 1727203881.96698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203881.96702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.96724: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.96769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203881.96772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203881.96856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203881.98985: stdout chunk (state=3): >>>ansible-tmp-1727203881.9625673-18126-76345331409128=/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128 <<< 15896 1727203881.99095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203881.99122: stderr chunk (state=3): >>><<< 15896 1727203881.99125: stdout chunk (state=3): >>><<< 15896 1727203881.99139: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203881.9625673-18126-76345331409128=/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203881.99185: variable 'ansible_module_compression' from source: unknown 15896 1727203881.99224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15896 1727203881.99278: variable 'ansible_facts' from source: unknown 15896 1727203881.99397: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py 15896 1727203881.99502: Sending initial data 15896 1727203881.99505: Sent initial data (161 bytes) 15896 1727203881.99955: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203881.99959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203881.99961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203881.99963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 
1727203881.99965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.00020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203882.00023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203882.00028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203882.00106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203882.01847: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203882.01921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203882.01995: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpneexlsg6 /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py <<< 15896 1727203882.01998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py" <<< 15896 1727203882.02066: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpneexlsg6" to remote "/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py" <<< 15896 1727203882.02072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py" <<< 15896 1727203882.03295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203882.03339: stderr chunk (state=3): >>><<< 15896 1727203882.03343: stdout chunk (state=3): >>><<< 15896 1727203882.03366: done transferring module to remote 15896 1727203882.03377: _low_level_execute_command(): starting 15896 1727203882.03380: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/ /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py && sleep 0' 15896 1727203882.03822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203882.03825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203882.03827: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.03829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203882.03835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.03882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203882.03894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203882.03980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203882.05948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203882.05977: stderr chunk (state=3): >>><<< 15896 1727203882.05980: stdout chunk (state=3): >>><<< 15896 1727203882.05998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203882.06001: _low_level_execute_command(): starting 15896 1727203882.06003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/AnsiballZ_package_facts.py && sleep 0' 15896 1727203882.06445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203882.06448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203882.06450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.06452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203882.06455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203882.06508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203882.06514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203882.06516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203882.06600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203882.53917: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15896 1727203882.54004: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": 
[{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": 
[{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", 
"version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 15896 1727203882.54186: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": 
"9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", 
"release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", 
"version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 15896 1727203882.54192: stdout chunk (state=3): >>>": 0, 
"arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", 
"version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": 
[{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 15896 1727203882.54212: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15896 1727203882.56285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203882.56303: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
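The JSON payload above is the return value of the `ansible.builtin.package_facts` module: `ansible_facts.packages` maps each package name to a *list* of entries (a list because several versions, e.g. multiple kernels, can be installed side by side), each with `name`, `version`, `release`, `epoch`, `arch`, and `source`. A minimal sketch of navigating that structure — the sample data is hand-copied from the log output above, and `pkg_version` is an illustrative helper, not part of Ansible:

```python
# Sketch: querying the dict shape returned by package_facts.
# Sample entries copied from the module output in this log.
packages = {
    "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10",
                "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def pkg_version(facts, name):
    """Return the version of the first installed entry for `name`, or None.

    Each key maps to a list: multiple versions of the same package
    (e.g. several kernels) can coexist, so index into it explicitly.
    """
    entries = facts.get(name, [])
    return entries[0]["version"] if entries else None

print(pkg_version(packages, "kernel"))   # 6.11.0
print(pkg_version(packages, "absent"))   # None
```

In a playbook the same lookup would typically appear as a Jinja2 expression such as `ansible_facts.packages['kernel'][0].version`, guarded by an `is defined` test for hosts where the package is absent.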
<<< 15896 1727203882.56359: stderr chunk (state=3): >>><<< 15896 1727203882.56513: stdout chunk (state=3): >>><<< 15896 1727203882.56595: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 15896 1727203882.58827: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203882.58863: _low_level_execute_command(): starting 15896 1727203882.58878: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203881.9625673-18126-76345331409128/ > /dev/null 2>&1 && sleep 0' 15896 1727203882.59497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.59519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203882.59592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203882.59619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203882.59644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203882.59685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203882.59777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203882.61870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203882.61874: stdout chunk (state=3): >>><<< 15896 1727203882.61879: stderr chunk (state=3): >>><<< 15896 1727203882.61897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
15896 1727203882.62084: handler run complete 15896 1727203882.62621: variable 'ansible_facts' from source: unknown 15896 1727203882.67669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.69483: variable 'ansible_facts' from source: unknown 15896 1727203882.69744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.70441: attempt loop complete, returning result 15896 1727203882.70450: _execute() done 15896 1727203882.70452: dumping result to json 15896 1727203882.70598: done dumping result, returning 15896 1727203882.70606: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-fb83-b6ad-0000000006cb] 15896 1727203882.70608: sending task result for task 028d2410-947f-fb83-b6ad-0000000006cb 15896 1727203882.75784: done sending task result for task 028d2410-947f-fb83-b6ad-0000000006cb 15896 1727203882.75788: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203882.75828: no more pending results, returning what we have 15896 1727203882.75830: results queue empty 15896 1727203882.75831: checking for any_errors_fatal 15896 1727203882.75833: done checking for any_errors_fatal 15896 1727203882.75833: checking for max_fail_percentage 15896 1727203882.75834: done checking for max_fail_percentage 15896 1727203882.75834: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.75835: done checking to see if all hosts have failed 15896 1727203882.75835: getting the remaining hosts for this loop 15896 1727203882.75836: done getting the remaining hosts for this loop 15896 1727203882.75838: getting the next task for host managed-node1 15896 1727203882.75841: done getting next task for 
host managed-node1 15896 1727203882.75843: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203882.75847: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203882.75855: getting variables 15896 1727203882.75856: in VariableManager get_vars() 15896 1727203882.75880: Calling all_inventory to load vars for managed-node1 15896 1727203882.75882: Calling groups_inventory to load vars for managed-node1 15896 1727203882.75883: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.75888: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.75889: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.75891: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.76517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.77365: done with get_vars() 15896 1727203882.77382: done getting variables 15896 1727203882.77416: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:22 -0400 (0:00:00.852) 0:00:28.363 ***** 15896 1727203882.77436: entering _queue_task() for managed-node1/debug 15896 1727203882.77716: worker is 1 (out of 1 available) 15896 1727203882.77729: exiting _queue_task() for managed-node1/debug 15896 1727203882.77740: done queuing things up, now waiting for results queue to drain 15896 1727203882.77742: waiting for pending results... 15896 1727203882.77925: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203882.78021: in run() - task 028d2410-947f-fb83-b6ad-00000000007c 15896 1727203882.78033: variable 'ansible_search_path' from source: unknown 15896 1727203882.78036: variable 'ansible_search_path' from source: unknown 15896 1727203882.78067: calling self._execute() 15896 1727203882.78142: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.78145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.78153: variable 'omit' from source: magic vars 15896 1727203882.78444: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.78452: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203882.78459: variable 'omit' from source: magic vars 15896 1727203882.78499: variable 'omit' from source: magic vars 15896 1727203882.78570: variable 'network_provider' from source: set_fact 15896 1727203882.78585: variable 'omit' from source: magic vars 15896 1727203882.78620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203882.78648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203882.78666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 
1727203882.78681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203882.78691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203882.78714: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203882.78717: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.78722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.78790: Set connection var ansible_shell_type to sh 15896 1727203882.78796: Set connection var ansible_connection to ssh 15896 1727203882.78801: Set connection var ansible_shell_executable to /bin/sh 15896 1727203882.78806: Set connection var ansible_pipelining to False 15896 1727203882.78811: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203882.78816: Set connection var ansible_timeout to 10 15896 1727203882.78834: variable 'ansible_shell_executable' from source: unknown 15896 1727203882.78838: variable 'ansible_connection' from source: unknown 15896 1727203882.78841: variable 'ansible_module_compression' from source: unknown 15896 1727203882.78843: variable 'ansible_shell_type' from source: unknown 15896 1727203882.78846: variable 'ansible_shell_executable' from source: unknown 15896 1727203882.78848: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.78850: variable 'ansible_pipelining' from source: unknown 15896 1727203882.78852: variable 'ansible_timeout' from source: unknown 15896 1727203882.78855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.78952: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203882.78969: variable 'omit' from source: magic vars 15896 1727203882.78981: starting attempt loop 15896 1727203882.78984: running the handler 15896 1727203882.79008: handler run complete 15896 1727203882.79018: attempt loop complete, returning result 15896 1727203882.79021: _execute() done 15896 1727203882.79024: dumping result to json 15896 1727203882.79027: done dumping result, returning 15896 1727203882.79034: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-fb83-b6ad-00000000007c] 15896 1727203882.79038: sending task result for task 028d2410-947f-fb83-b6ad-00000000007c 15896 1727203882.79122: done sending task result for task 028d2410-947f-fb83-b6ad-00000000007c 15896 1727203882.79125: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 15896 1727203882.79192: no more pending results, returning what we have 15896 1727203882.79195: results queue empty 15896 1727203882.79196: checking for any_errors_fatal 15896 1727203882.79208: done checking for any_errors_fatal 15896 1727203882.79208: checking for max_fail_percentage 15896 1727203882.79210: done checking for max_fail_percentage 15896 1727203882.79210: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.79211: done checking to see if all hosts have failed 15896 1727203882.79211: getting the remaining hosts for this loop 15896 1727203882.79213: done getting the remaining hosts for this loop 15896 1727203882.79216: getting the next task for host managed-node1 15896 1727203882.79222: done getting next task for host managed-node1 15896 1727203882.79225: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 15896 1727203882.79228: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203882.79240: getting variables 15896 1727203882.79241: in VariableManager get_vars() 15896 1727203882.79291: Calling all_inventory to load vars for managed-node1 15896 1727203882.79294: Calling groups_inventory to load vars for managed-node1 15896 1727203882.79295: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.79304: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.79306: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.79309: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.80174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.81046: done with get_vars() 15896 1727203882.81065: done getting variables 15896 1727203882.81109: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:22 -0400 (0:00:00.036) 0:00:28.400 ***** 15896 1727203882.81135: entering _queue_task() for managed-node1/fail 15896 1727203882.81392: worker is 1 (out of 1 available) 15896 1727203882.81405: exiting _queue_task() for managed-node1/fail 15896 1727203882.81415: done queuing things up, now waiting for results queue to drain 15896 1727203882.81416: waiting for pending results... 15896 1727203882.81602: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203882.81696: in run() - task 028d2410-947f-fb83-b6ad-00000000007d 15896 1727203882.81707: variable 'ansible_search_path' from source: unknown 15896 1727203882.81710: variable 'ansible_search_path' from source: unknown 15896 1727203882.81740: calling self._execute() 15896 1727203882.81822: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.81826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.81835: variable 'omit' from source: magic vars 15896 1727203882.82111: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.82120: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203882.82207: variable 'network_state' from source: role '' defaults 15896 1727203882.82213: Evaluated conditional (network_state != {}): False 15896 1727203882.82216: when evaluation is False, skipping this task 15896 1727203882.82219: _execute() done 15896 1727203882.82222: dumping result to json 15896 1727203882.82224: done dumping result, returning 15896 1727203882.82231: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [028d2410-947f-fb83-b6ad-00000000007d] 15896 1727203882.82237: sending task result for task 028d2410-947f-fb83-b6ad-00000000007d 15896 1727203882.82329: done sending task result for task 028d2410-947f-fb83-b6ad-00000000007d 15896 1727203882.82332: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203882.82380: no more pending results, returning what we have 15896 1727203882.82383: results queue empty 15896 1727203882.82384: checking for any_errors_fatal 15896 1727203882.82392: done checking for any_errors_fatal 15896 1727203882.82392: checking for max_fail_percentage 15896 1727203882.82394: done checking for max_fail_percentage 15896 1727203882.82395: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.82395: done checking to see if all hosts have failed 15896 1727203882.82396: getting the remaining hosts for this loop 15896 1727203882.82398: done getting the remaining hosts for this loop 15896 1727203882.82401: getting the next task for host managed-node1 15896 1727203882.82407: done getting next task for host managed-node1 15896 1727203882.82411: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203882.82414: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15896 1727203882.82433: getting variables 15896 1727203882.82435: in VariableManager get_vars() 15896 1727203882.82483: Calling all_inventory to load vars for managed-node1 15896 1727203882.82486: Calling groups_inventory to load vars for managed-node1 15896 1727203882.82488: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.82496: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.82498: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.82500: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.83258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.84138: done with get_vars() 15896 1727203882.84154: done getting variables 15896 1727203882.84204: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:22 -0400 (0:00:00.030) 0:00:28.431 ***** 15896 1727203882.84228: entering _queue_task() for managed-node1/fail 15896 1727203882.84488: worker is 1 (out of 1 available) 15896 1727203882.84501: exiting _queue_task() for managed-node1/fail 15896 1727203882.84513: done queuing things up, now waiting for results queue to drain 15896 1727203882.84515: waiting for pending results... 
15896 1727203882.84698: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203882.84791: in run() - task 028d2410-947f-fb83-b6ad-00000000007e 15896 1727203882.84802: variable 'ansible_search_path' from source: unknown 15896 1727203882.84806: variable 'ansible_search_path' from source: unknown 15896 1727203882.84833: calling self._execute() 15896 1727203882.84917: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.84921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.84931: variable 'omit' from source: magic vars 15896 1727203882.85207: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.85217: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203882.85303: variable 'network_state' from source: role '' defaults 15896 1727203882.85312: Evaluated conditional (network_state != {}): False 15896 1727203882.85315: when evaluation is False, skipping this task 15896 1727203882.85318: _execute() done 15896 1727203882.85321: dumping result to json 15896 1727203882.85323: done dumping result, returning 15896 1727203882.85330: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-fb83-b6ad-00000000007e] 15896 1727203882.85335: sending task result for task 028d2410-947f-fb83-b6ad-00000000007e 15896 1727203882.85423: done sending task result for task 028d2410-947f-fb83-b6ad-00000000007e 15896 1727203882.85427: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203882.85478: no more pending results, returning what we have 15896 
1727203882.85482: results queue empty 15896 1727203882.85482: checking for any_errors_fatal 15896 1727203882.85489: done checking for any_errors_fatal 15896 1727203882.85490: checking for max_fail_percentage 15896 1727203882.85491: done checking for max_fail_percentage 15896 1727203882.85492: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.85493: done checking to see if all hosts have failed 15896 1727203882.85493: getting the remaining hosts for this loop 15896 1727203882.85495: done getting the remaining hosts for this loop 15896 1727203882.85498: getting the next task for host managed-node1 15896 1727203882.85504: done getting next task for host managed-node1 15896 1727203882.85508: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203882.85511: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203882.85528: getting variables 15896 1727203882.85530: in VariableManager get_vars() 15896 1727203882.85580: Calling all_inventory to load vars for managed-node1 15896 1727203882.85583: Calling groups_inventory to load vars for managed-node1 15896 1727203882.85585: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.85593: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.85595: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.85598: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.86493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.87350: done with get_vars() 15896 1727203882.87368: done getting variables 15896 1727203882.87415: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:22 -0400 (0:00:00.032) 0:00:28.463 ***** 15896 1727203882.87438: entering _queue_task() for managed-node1/fail 15896 1727203882.87687: worker is 1 (out of 1 available) 15896 1727203882.87700: exiting _queue_task() for managed-node1/fail 15896 1727203882.87712: done queuing things up, now waiting for results queue to drain 15896 1727203882.87714: waiting for pending results... 
15896 1727203882.87905: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203882.88008: in run() - task 028d2410-947f-fb83-b6ad-00000000007f 15896 1727203882.88019: variable 'ansible_search_path' from source: unknown 15896 1727203882.88023: variable 'ansible_search_path' from source: unknown 15896 1727203882.88052: calling self._execute() 15896 1727203882.88131: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.88134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.88144: variable 'omit' from source: magic vars 15896 1727203882.88423: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.88433: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203882.88554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203882.90065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203882.90114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203882.90144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203882.90170: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203882.90192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203882.90253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.90287: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.90305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.90335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.90346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.90421: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.90435: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15896 1727203882.90520: variable 'ansible_distribution' from source: facts 15896 1727203882.90524: variable '__network_rh_distros' from source: role '' defaults 15896 1727203882.90532: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15896 1727203882.90699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.90716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.90733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 
1727203882.90761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.90774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.90807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.90823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.90839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.90865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.90886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.90915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.90931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15896 1727203882.90949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.90980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.90998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.91188: variable 'network_connections' from source: task vars 15896 1727203882.91200: variable 'controller_profile' from source: play vars 15896 1727203882.91246: variable 'controller_profile' from source: play vars 15896 1727203882.91254: variable 'network_state' from source: role '' defaults 15896 1727203882.91305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203882.91419: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203882.91445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203882.91469: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203882.91492: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203882.91526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203882.91541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203882.91565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.91582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203882.91602: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15896 1727203882.91605: when evaluation is False, skipping this task 15896 1727203882.91607: _execute() done 15896 1727203882.91611: dumping result to json 15896 1727203882.91614: done dumping result, returning 15896 1727203882.91623: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-fb83-b6ad-00000000007f] 15896 1727203882.91626: sending task result for task 028d2410-947f-fb83-b6ad-00000000007f 15896 1727203882.91715: done sending task result for task 028d2410-947f-fb83-b6ad-00000000007f 15896 1727203882.91718: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15896 1727203882.91784: no more pending results, returning what we have 15896 
1727203882.91787: results queue empty 15896 1727203882.91788: checking for any_errors_fatal 15896 1727203882.91795: done checking for any_errors_fatal 15896 1727203882.91796: checking for max_fail_percentage 15896 1727203882.91797: done checking for max_fail_percentage 15896 1727203882.91798: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.91799: done checking to see if all hosts have failed 15896 1727203882.91799: getting the remaining hosts for this loop 15896 1727203882.91801: done getting the remaining hosts for this loop 15896 1727203882.91804: getting the next task for host managed-node1 15896 1727203882.91811: done getting next task for host managed-node1 15896 1727203882.91814: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203882.91817: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203882.91834: getting variables 15896 1727203882.91836: in VariableManager get_vars() 15896 1727203882.91891: Calling all_inventory to load vars for managed-node1 15896 1727203882.91893: Calling groups_inventory to load vars for managed-node1 15896 1727203882.91895: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.91904: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.91907: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.91909: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.92705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203882.93585: done with get_vars() 15896 1727203882.93601: done getting variables 15896 1727203882.93644: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:51:22 -0400 (0:00:00.062) 0:00:28.526 ***** 15896 1727203882.93669: entering _queue_task() for managed-node1/dnf 15896 1727203882.93914: worker is 1 (out of 1 available) 15896 1727203882.93928: exiting _queue_task() for managed-node1/dnf 15896 1727203882.93940: done queuing things up, now waiting for results queue to drain 15896 1727203882.93941: waiting for pending results... 
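The conditional skipped above — `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ... | length > 0` — can be mirrored in plain Python to see why it came out False. This is a sketch, not the role's code; the example connection list is hypothetical (the run only defines a `controller_profile` play var, whose contents are not shown in this chunk).

```python
import re

def has_team_connection(network_connections, network_state=None):
    """Sketch of the role's team-detection conditional:
    keep items whose "type" key is defined and matches ^team$,
    in either network_connections or network_state["interfaces"]."""
    network_state = network_state or {}

    def teams(items):
        return [c for c in items
                if "type" in c and re.match(r"^team$", c["type"])]

    return (len(teams(network_connections)) > 0
            or len(teams(network_state.get("interfaces", []))) > 0)

# Hypothetical profile with no team type: the conditional is False,
# so the "Abort applying teaming configuration" task is skipped.
print(has_team_connection([{"name": "example0", "type": "bond"}]))
```

Note that `selectattr("type", "defined")` guards against profiles that omit `type` entirely; the Python sketch does the same with the `"type" in c` check before the regex match.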
15896 1727203882.94119: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203882.94209: in run() - task 028d2410-947f-fb83-b6ad-000000000080 15896 1727203882.94220: variable 'ansible_search_path' from source: unknown 15896 1727203882.94223: variable 'ansible_search_path' from source: unknown 15896 1727203882.94251: calling self._execute() 15896 1727203882.94329: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203882.94333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203882.94342: variable 'omit' from source: magic vars 15896 1727203882.94610: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.94620: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203882.94755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203882.96445: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203882.96491: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203882.96517: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203882.96541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203882.96569: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203882.96621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.96641: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.96660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.96694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.96705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.96788: variable 'ansible_distribution' from source: facts 15896 1727203882.96791: variable 'ansible_distribution_major_version' from source: facts 15896 1727203882.96804: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15896 1727203882.96878: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203882.96960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.96979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.97001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.97024: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.97038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.97064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.97086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203882.97107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.97130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.97141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.97168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203882.97186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 
1727203882.97202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.97231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203882.97242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203882.97354: variable 'network_connections' from source: task vars 15896 1727203882.97366: variable 'controller_profile' from source: play vars 15896 1727203882.97410: variable 'controller_profile' from source: play vars 15896 1727203882.97464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203882.97574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203882.97602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203882.97624: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203882.97645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203882.97680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203882.97696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203882.97717: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203882.97734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203882.97880: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203882.97921: variable 'network_connections' from source: task vars 15896 1727203882.97924: variable 'controller_profile' from source: play vars 15896 1727203882.97968: variable 'controller_profile' from source: play vars 15896 1727203882.97992: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203882.97996: when evaluation is False, skipping this task 15896 1727203882.97998: _execute() done 15896 1727203882.98001: dumping result to json 15896 1727203882.98003: done dumping result, returning 15896 1727203882.98010: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000080] 15896 1727203882.98015: sending task result for task 028d2410-947f-fb83-b6ad-000000000080 15896 1727203882.98115: done sending task result for task 028d2410-947f-fb83-b6ad-000000000080 15896 1727203882.98118: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203882.98168: no more pending results, returning what we have 15896 1727203882.98171: results queue empty 15896 1727203882.98172: checking for any_errors_fatal 15896 1727203882.98178: done checking for 
any_errors_fatal 15896 1727203882.98179: checking for max_fail_percentage 15896 1727203882.98181: done checking for max_fail_percentage 15896 1727203882.98181: checking to see if all hosts have failed and the running result is not ok 15896 1727203882.98182: done checking to see if all hosts have failed 15896 1727203882.98182: getting the remaining hosts for this loop 15896 1727203882.98184: done getting the remaining hosts for this loop 15896 1727203882.98188: getting the next task for host managed-node1 15896 1727203882.98194: done getting next task for host managed-node1 15896 1727203882.98198: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203882.98200: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203882.98217: getting variables 15896 1727203882.98218: in VariableManager get_vars() 15896 1727203882.98271: Calling all_inventory to load vars for managed-node1 15896 1727203882.98274: Calling groups_inventory to load vars for managed-node1 15896 1727203882.98283: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203882.98293: Calling all_plugins_play to load vars for managed-node1 15896 1727203882.98295: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203882.98298: Calling groups_plugins_play to load vars for managed-node1 15896 1727203882.99200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203883.00061: done with get_vars() 15896 1727203883.00082: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15896 1727203883.00140: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:51:23 -0400 (0:00:00.064) 0:00:28.590 ***** 15896 1727203883.00164: entering _queue_task() for managed-node1/yum 15896 1727203883.00432: worker is 1 (out of 1 available) 15896 1727203883.00445: exiting _queue_task() for managed-node1/yum 15896 1727203883.00457: done queuing things up, now waiting for results queue to drain 15896 1727203883.00459: waiting for pending results... 
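Three distribution-version guards appear in this run: `ansible_distribution_major_version != '6'` (True), `| int > 9` (True, so the host is EL10 or later), and `| int < 8` (False, which skips the YUM task in favor of DNF). A small sketch of that logic, assuming a major version of "10" (inferred from the `> 9` result, not stated directly in the log); note Ansible stores the fact as a string, which is why the tasks cast with `| int` before numeric comparison.

```python
# Assumed for illustration; Ansible keeps this fact as a string.
ansible_distribution_major_version = "10"

# String comparison is safe for the inequality-with-'6' guard...
run_on_non_el6 = ansible_distribution_major_version != "6"
# ...but numeric guards must cast first, as the "| int" filter does.
use_yum_path = int(ansible_distribution_major_version) < 8
is_el10_or_later = int(ansible_distribution_major_version) > 9

print(run_on_non_el6, use_yum_path, is_el10_or_later)
```

Without the cast, a lexicographic comparison like `"10" < "8"` would evaluate True and take the wrong branch, which is the pitfall the `| int` filter avoids.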
15896 1727203883.00649: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203883.00754: in run() - task 028d2410-947f-fb83-b6ad-000000000081 15896 1727203883.00768: variable 'ansible_search_path' from source: unknown 15896 1727203883.00771: variable 'ansible_search_path' from source: unknown 15896 1727203883.00803: calling self._execute() 15896 1727203883.00883: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.00886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.00898: variable 'omit' from source: magic vars 15896 1727203883.01171: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.01182: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203883.01304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203883.02826: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203883.02877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203883.02904: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203883.02929: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203883.02947: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203883.03012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.03041: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.03061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.03094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.03105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.03186: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.03195: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15896 1727203883.03198: when evaluation is False, skipping this task 15896 1727203883.03200: _execute() done 15896 1727203883.03203: dumping result to json 15896 1727203883.03208: done dumping result, returning 15896 1727203883.03216: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000081] 15896 1727203883.03221: sending task result for task 028d2410-947f-fb83-b6ad-000000000081 15896 1727203883.03316: done sending task result for task 028d2410-947f-fb83-b6ad-000000000081 15896 1727203883.03318: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15896 1727203883.03370: no more pending results, returning 
what we have 15896 1727203883.03374: results queue empty 15896 1727203883.03374: checking for any_errors_fatal 15896 1727203883.03383: done checking for any_errors_fatal 15896 1727203883.03383: checking for max_fail_percentage 15896 1727203883.03385: done checking for max_fail_percentage 15896 1727203883.03386: checking to see if all hosts have failed and the running result is not ok 15896 1727203883.03386: done checking to see if all hosts have failed 15896 1727203883.03387: getting the remaining hosts for this loop 15896 1727203883.03388: done getting the remaining hosts for this loop 15896 1727203883.03392: getting the next task for host managed-node1 15896 1727203883.03398: done getting next task for host managed-node1 15896 1727203883.03402: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203883.03405: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203883.03422: getting variables 15896 1727203883.03424: in VariableManager get_vars() 15896 1727203883.03485: Calling all_inventory to load vars for managed-node1 15896 1727203883.03488: Calling groups_inventory to load vars for managed-node1 15896 1727203883.03490: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203883.03501: Calling all_plugins_play to load vars for managed-node1 15896 1727203883.03503: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203883.03505: Calling groups_plugins_play to load vars for managed-node1 15896 1727203883.04306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203883.05273: done with get_vars() 15896 1727203883.05291: done getting variables 15896 1727203883.05333: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:51:23 -0400 (0:00:00.051) 0:00:28.642 ***** 15896 1727203883.05355: entering _queue_task() for managed-node1/fail 15896 1727203883.05605: worker is 1 (out of 1 available) 15896 1727203883.05619: exiting _queue_task() for managed-node1/fail 15896 1727203883.05631: done queuing things up, now waiting for results queue to drain 15896 1727203883.05633: waiting for pending results... 
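The `skipping: [managed-node1] => { ... }` payloads interleaved in this log are plain JSON, so post-processing a `-vvvv` capture is straightforward. A minimal sketch, using one of the skip results shown above verbatim:

```python
import json

# Verbatim skip result from the log; parse and pull the fields of interest.
result = json.loads("""
{
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
""")

print(result["skip_reason"])
print(result["false_condition"])
```

In practice the JSON is embedded mid-line in the log stream, so a real parser would first have to locate the `=> {` marker and extract the balanced-brace span before calling `json.loads`.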
15896 1727203883.05822: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203883.05921: in run() - task 028d2410-947f-fb83-b6ad-000000000082 15896 1727203883.05933: variable 'ansible_search_path' from source: unknown 15896 1727203883.05937: variable 'ansible_search_path' from source: unknown 15896 1727203883.05974: calling self._execute() 15896 1727203883.06045: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.06051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.06059: variable 'omit' from source: magic vars 15896 1727203883.06340: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.06349: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203883.06436: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203883.06569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203883.08051: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203883.08098: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203883.08125: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203883.08150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203883.08178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203883.08233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False)
15896 1727203883.08269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.08291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.08316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.08327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.08359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.08381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.08399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.08422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.08433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.08463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.08493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.08502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.08526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.08536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.08652: variable 'network_connections' from source: task vars
15896 1727203883.08666: variable 'controller_profile' from source: play vars
15896 1727203883.08713: variable 'controller_profile' from source: play vars
15896 1727203883.08764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203883.08871: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203883.08899: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203883.08922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203883.08944: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203883.08977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203883.08993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203883.09010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.09031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203883.09067: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203883.09218: variable 'network_connections' from source: task vars
15896 1727203883.09222: variable 'controller_profile' from source: play vars
15896 1727203883.09266: variable 'controller_profile' from source: play vars
15896 1727203883.09287: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15896 1727203883.09290: when evaluation is False, skipping this task
15896 1727203883.09293: _execute() done
15896 1727203883.09296: dumping result to json
15896 1727203883.09298: done dumping result, returning
15896 1727203883.09305: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000082]
15896 1727203883.09309: sending task result for task 028d2410-947f-fb83-b6ad-000000000082
15896 1727203883.09401: done sending task result for task 028d2410-947f-fb83-b6ad-000000000082
15896 1727203883.09404: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15896 1727203883.09451: no more pending results, returning what we have
15896 1727203883.09455: results queue empty
15896 1727203883.09456: checking for any_errors_fatal
15896 1727203883.09463: done checking for any_errors_fatal
15896 1727203883.09463: checking for max_fail_percentage
15896 1727203883.09465: done checking for max_fail_percentage
15896 1727203883.09465: checking to see if all hosts have failed and the running result is not ok
15896 1727203883.09466: done checking to see if all hosts have failed
15896 1727203883.09467: getting the remaining hosts for this loop
15896 1727203883.09468: done getting the remaining hosts for this loop
15896 1727203883.09471: getting the next task for host managed-node1
15896 1727203883.09478: done getting next task for host managed-node1
15896 1727203883.09483: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
15896 1727203883.09485: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203883.09503: getting variables
15896 1727203883.09505: in VariableManager get_vars()
15896 1727203883.09556: Calling all_inventory to load vars for managed-node1
15896 1727203883.09558: Calling groups_inventory to load vars for managed-node1
15896 1727203883.09563: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203883.09572: Calling all_plugins_play to load vars for managed-node1
15896 1727203883.09575: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203883.09584: Calling groups_plugins_play to load vars for managed-node1
15896 1727203883.10360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203883.11218: done with get_vars()
15896 1727203883.11234: done getting variables
15896 1727203883.11277: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024  14:51:23 -0400 (0:00:00.059)       0:00:28.702 *****
15896 1727203883.11302: entering _queue_task() for managed-node1/package
15896 1727203883.11532: worker is 1 (out of 1 available)
15896 1727203883.11544: exiting _queue_task() for managed-node1/package
15896 1727203883.11555: done queuing things up, now waiting for results queue to drain
15896 1727203883.11557: waiting for pending results...
15896 1727203883.11743: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
15896 1727203883.11844: in run() - task 028d2410-947f-fb83-b6ad-000000000083
15896 1727203883.11856: variable 'ansible_search_path' from source: unknown
15896 1727203883.11860: variable 'ansible_search_path' from source: unknown
15896 1727203883.11895: calling self._execute()
15896 1727203883.11963: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203883.11970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203883.11980: variable 'omit' from source: magic vars
15896 1727203883.12249: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.12259: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203883.12396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203883.12588: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203883.12621: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203883.12647: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203883.12702: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203883.12780: variable 'network_packages' from source: role '' defaults
15896 1727203883.12852: variable '__network_provider_setup' from source: role '' defaults
15896 1727203883.12859: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203883.12911: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203883.12919: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203883.12961: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203883.13088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203883.14684: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203883.14727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203883.14755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203883.14782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203883.14802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203883.14861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.14885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.14903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.14928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.14941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.14979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.14995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.15011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.15035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.15046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.15196: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15896 1727203883.15272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.15292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.15309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.15333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.15343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.15410: variable 'ansible_python' from source: facts
15896 1727203883.15431: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15896 1727203883.15492: variable '__network_wpa_supplicant_required' from source: role '' defaults
15896 1727203883.15546: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15896 1727203883.15632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.15648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.15664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.15696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.15706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.15740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203883.15759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203883.15778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.15802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203883.15813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203883.15909: variable 'network_connections' from source: task vars
15896 1727203883.15915: variable 'controller_profile' from source: play vars
15896 1727203883.15987: variable 'controller_profile' from source: play vars
15896 1727203883.16042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203883.16058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203883.16081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203883.16102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203883.16139: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203883.16320: variable 'network_connections' from source: task vars
15896 1727203883.16323: variable 'controller_profile' from source: play vars
15896 1727203883.16395: variable 'controller_profile' from source: play vars
15896 1727203883.16418: variable '__network_packages_default_wireless' from source: role '' defaults
15896 1727203883.16472: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203883.16680: variable 'network_connections' from source: task vars
15896 1727203883.16683: variable 'controller_profile' from source: play vars
15896 1727203883.16731: variable 'controller_profile' from source: play vars
15896 1727203883.16747: variable '__network_packages_default_team' from source: role '' defaults
15896 1727203883.16804: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203883.16993: variable 'network_connections' from source: task vars
15896 1727203883.16997: variable 'controller_profile' from source: play vars
15896 1727203883.17046: variable 'controller_profile' from source: play vars
15896 1727203883.17085: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203883.17126: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203883.17138: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203883.17177: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203883.17309: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15896 1727203883.17603: variable 'network_connections' from source: task vars
15896 1727203883.17607: variable 'controller_profile' from source: play vars
15896 1727203883.17648: variable 'controller_profile' from source: play vars
15896 1727203883.17654: variable 'ansible_distribution' from source: facts
15896 1727203883.17657: variable '__network_rh_distros' from source: role '' defaults
15896 1727203883.17664: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.17685: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15896 1727203883.17791: variable 'ansible_distribution' from source: facts
15896 1727203883.17794: variable '__network_rh_distros' from source: role '' defaults
15896 1727203883.17797: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.17809: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15896 1727203883.18082: variable 'ansible_distribution' from source: facts
15896 1727203883.18085: variable '__network_rh_distros' from source: role '' defaults
15896 1727203883.18088: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.18092: variable 'network_provider' from source: set_fact
15896 1727203883.18094: variable 'ansible_facts' from source: unknown
15896 1727203883.18490: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
15896 1727203883.18494: when evaluation is False, skipping this task
15896 1727203883.18496: _execute() done
15896 1727203883.18499: dumping result to json
15896 1727203883.18501: done dumping result, returning
15896 1727203883.18508: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-fb83-b6ad-000000000083]
15896 1727203883.18513: sending task result for task 028d2410-947f-fb83-b6ad-000000000083
15896 1727203883.18606: done sending task result for task 028d2410-947f-fb83-b6ad-000000000083
15896 1727203883.18609: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
15896 1727203883.18659: no more pending results, returning what we have
15896 1727203883.18662: results queue empty
15896 1727203883.18663: checking for any_errors_fatal
15896 1727203883.18671: done checking for any_errors_fatal
15896 1727203883.18671: checking for max_fail_percentage
15896 1727203883.18673: done checking for max_fail_percentage
15896 1727203883.18674: checking to see if all hosts have failed and the running result is not ok
15896 1727203883.18674: done checking to see if all hosts have failed
15896 1727203883.18676: getting the remaining hosts for this loop
15896 1727203883.18678: done getting the remaining hosts for this loop
15896 1727203883.18681: getting the next task for host managed-node1
15896 1727203883.18688: done getting next task for host managed-node1
15896 1727203883.18692: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15896 1727203883.18695: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203883.18711: getting variables
15896 1727203883.18713: in VariableManager get_vars()
15896 1727203883.18763: Calling all_inventory to load vars for managed-node1
15896 1727203883.18765: Calling groups_inventory to load vars for managed-node1
15896 1727203883.18768: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203883.18784: Calling all_plugins_play to load vars for managed-node1
15896 1727203883.18787: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203883.18791: Calling groups_plugins_play to load vars for managed-node1
15896 1727203883.19727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203883.21147: done with get_vars()
15896 1727203883.21173: done getting variables
15896 1727203883.21234: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024  14:51:23 -0400 (0:00:00.099)       0:00:28.801 *****
15896 1727203883.21267: entering _queue_task() for managed-node1/package
15896 1727203883.21608: worker is 1 (out of 1 available)
15896 1727203883.21621: exiting _queue_task() for managed-node1/package
15896 1727203883.21633: done queuing things up, now waiting for results queue to drain
15896 1727203883.21635: waiting for pending results...
15896 1727203883.22005: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15896 1727203883.22100: in run() - task 028d2410-947f-fb83-b6ad-000000000084
15896 1727203883.22282: variable 'ansible_search_path' from source: unknown
15896 1727203883.22286: variable 'ansible_search_path' from source: unknown
15896 1727203883.22289: calling self._execute()
15896 1727203883.22292: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203883.22295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203883.22297: variable 'omit' from source: magic vars
15896 1727203883.22660: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.22681: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203883.22806: variable 'network_state' from source: role '' defaults
15896 1727203883.22820: Evaluated conditional (network_state != {}): False
15896 1727203883.22827: when evaluation is False, skipping this task
15896 1727203883.22833: _execute() done
15896 1727203883.22840: dumping result to json
15896 1727203883.22854: done dumping result, returning
15896 1727203883.22865: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000084]
15896 1727203883.22874: sending task result for task 028d2410-947f-fb83-b6ad-000000000084
15896 1727203883.23106: done sending task result for task 028d2410-947f-fb83-b6ad-000000000084
15896 1727203883.23110: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203883.23160: no more pending results, returning what we have
15896 1727203883.23164: results queue empty
15896 1727203883.23165: checking for any_errors_fatal
15896 1727203883.23173: done checking for any_errors_fatal
15896 1727203883.23174: checking for max_fail_percentage
15896 1727203883.23179: done checking for max_fail_percentage
15896 1727203883.23179: checking to see if all hosts have failed and the running result is not ok
15896 1727203883.23180: done checking to see if all hosts have failed
15896 1727203883.23181: getting the remaining hosts for this loop
15896 1727203883.23182: done getting the remaining hosts for this loop
15896 1727203883.23186: getting the next task for host managed-node1
15896 1727203883.23193: done getting next task for host managed-node1
15896 1727203883.23197: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15896 1727203883.23201: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203883.23221: getting variables
15896 1727203883.23223: in VariableManager get_vars()
15896 1727203883.23327: Calling all_inventory to load vars for managed-node1
15896 1727203883.23330: Calling groups_inventory to load vars for managed-node1
15896 1727203883.23332: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203883.23343: Calling all_plugins_play to load vars for managed-node1
15896 1727203883.23345: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203883.23348: Calling groups_plugins_play to load vars for managed-node1
15896 1727203883.24175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203883.25037: done with get_vars()
15896 1727203883.25056: done getting variables
15896 1727203883.25101: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024  14:51:23 -0400 (0:00:00.038)       0:00:28.840 *****
15896 1727203883.25123: entering _queue_task() for managed-node1/package
15896 1727203883.25362: worker is 1 (out of 1 available)
15896 1727203883.25376: exiting _queue_task() for managed-node1/package
15896 1727203883.25389: done queuing things up, now waiting for results queue to drain
15896 1727203883.25391: waiting for pending results...
15896 1727203883.25695: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15896 1727203883.25753: in run() - task 028d2410-947f-fb83-b6ad-000000000085
15896 1727203883.25774: variable 'ansible_search_path' from source: unknown
15896 1727203883.25785: variable 'ansible_search_path' from source: unknown
15896 1727203883.25828: calling self._execute()
15896 1727203883.25932: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203883.25944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203883.26010: variable 'omit' from source: magic vars
15896 1727203883.26349: variable 'ansible_distribution_major_version' from source: facts
15896 1727203883.26368: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203883.26500: variable 'network_state' from source: role '' defaults
15896 1727203883.26516: Evaluated conditional (network_state != {}): False
15896 1727203883.26523: when evaluation is False, skipping this task
15896 1727203883.26529: _execute() done
15896 1727203883.26536: dumping result to json
15896 1727203883.26543: done dumping result, returning
15896 1727203883.26561: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000085]
15896 1727203883.26668: sending task result for task 028d2410-947f-fb83-b6ad-000000000085
15896 1727203883.26777: done sending task result for task 028d2410-947f-fb83-b6ad-000000000085
15896 1727203883.26780: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203883.26827: no more pending results, returning what we have
15896 1727203883.26831: results queue empty
15896 1727203883.26832: checking for any_errors_fatal
15896 1727203883.26839: done checking for any_errors_fatal
15896 1727203883.26840: checking for max_fail_percentage
15896 1727203883.26842: done checking for max_fail_percentage
15896 1727203883.26843: checking to see if all hosts have failed and the running result is not ok
15896 1727203883.26844: done checking to see if all hosts have failed
15896 1727203883.26845: getting the remaining hosts for this loop
15896 1727203883.26846: done getting the remaining hosts for this loop
15896 1727203883.26849: getting the next task for host managed-node1
15896 1727203883.26856: done getting next task for host managed-node1
15896 1727203883.26860: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15896 1727203883.26863: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203883.26886: getting variables
15896 1727203883.26888: in VariableManager get_vars()
15896 1727203883.26943: Calling all_inventory to load vars for managed-node1
15896 1727203883.26946: Calling groups_inventory to load vars for managed-node1
15896 1727203883.26949: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203883.26960: Calling all_plugins_play to load vars for managed-node1
15896 1727203883.26963: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203883.26966: Calling groups_plugins_play to load vars for managed-node1
15896 1727203883.28733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203883.30329: done with get_vars()
15896 1727203883.30358: done getting variables
15896 1727203883.30418: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024  14:51:23 -0400 (0:00:00.053)       0:00:28.893 *****
15896 1727203883.30459: entering _queue_task() for managed-node1/service
15896 1727203883.31007: worker is 1 (out of 1 available)
15896 1727203883.31017: exiting _queue_task() for managed-node1/service
15896 1727203883.31028: done queuing things up, now waiting for results queue to drain
15896 1727203883.31030: waiting for pending results...
15896 1727203883.31269: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15896 1727203883.31368: in run() - task 028d2410-947f-fb83-b6ad-000000000086 15896 1727203883.31372: variable 'ansible_search_path' from source: unknown 15896 1727203883.31377: variable 'ansible_search_path' from source: unknown 15896 1727203883.31380: calling self._execute() 15896 1727203883.31457: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.31478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.31495: variable 'omit' from source: magic vars 15896 1727203883.31862: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.31881: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203883.32007: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203883.32213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203883.34501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203883.34573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203883.34614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203883.34661: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203883.34737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203883.34778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15896 1727203883.34824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.34857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.34899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.34915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.34971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.35002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.35063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.35081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.35098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.35139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.35172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.35204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.35283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.35287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.35439: variable 'network_connections' from source: task vars 15896 1727203883.35458: variable 'controller_profile' from source: play vars 15896 1727203883.35537: variable 'controller_profile' from source: play vars 15896 1727203883.35623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203883.35825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203883.35842: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203883.35879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 
1727203883.35911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203883.35963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203883.36042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203883.36045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.36047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203883.36099: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203883.36352: variable 'network_connections' from source: task vars 15896 1727203883.36369: variable 'controller_profile' from source: play vars 15896 1727203883.36432: variable 'controller_profile' from source: play vars 15896 1727203883.36461: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203883.36469: when evaluation is False, skipping this task 15896 1727203883.36486: _execute() done 15896 1727203883.36581: dumping result to json 15896 1727203883.36585: done dumping result, returning 15896 1727203883.36588: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000086] 15896 1727203883.36590: sending task result for task 028d2410-947f-fb83-b6ad-000000000086 15896 1727203883.36661: done sending task 
result for task 028d2410-947f-fb83-b6ad-000000000086 15896 1727203883.36670: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15896 1727203883.36718: no more pending results, returning what we have 15896 1727203883.36721: results queue empty 15896 1727203883.36722: checking for any_errors_fatal 15896 1727203883.36728: done checking for any_errors_fatal 15896 1727203883.36729: checking for max_fail_percentage 15896 1727203883.36731: done checking for max_fail_percentage 15896 1727203883.36731: checking to see if all hosts have failed and the running result is not ok 15896 1727203883.36732: done checking to see if all hosts have failed 15896 1727203883.36733: getting the remaining hosts for this loop 15896 1727203883.36734: done getting the remaining hosts for this loop 15896 1727203883.36738: getting the next task for host managed-node1 15896 1727203883.36744: done getting next task for host managed-node1 15896 1727203883.36749: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203883.36752: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15896 1727203883.36769: getting variables 15896 1727203883.36771: in VariableManager get_vars() 15896 1727203883.36826: Calling all_inventory to load vars for managed-node1 15896 1727203883.36829: Calling groups_inventory to load vars for managed-node1 15896 1727203883.36831: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203883.36842: Calling all_plugins_play to load vars for managed-node1 15896 1727203883.36845: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203883.36848: Calling groups_plugins_play to load vars for managed-node1 15896 1727203883.38488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203883.41183: done with get_vars() 15896 1727203883.41203: done getting variables 15896 1727203883.41257: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:51:23 -0400 (0:00:00.110) 0:00:29.004 ***** 15896 1727203883.41499: entering _queue_task() for managed-node1/service 15896 1727203883.42243: worker is 1 (out of 1 available) 15896 1727203883.42257: exiting _queue_task() for managed-node1/service 15896 1727203883.42269: done queuing things up, now waiting for results queue to drain 15896 1727203883.42271: waiting for pending results... 
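The skip reported above comes from the role's `when:` evaluation: the combined condition `__network_wireless_connections_defined or __network_team_connections_defined` rendered False, so the NetworkManager restart never ran, and the result carries the failing expression back as `false_condition`. A minimal Python sketch of that semantics — an assumed simplification, not Ansible's actual `TaskExecutor`; the function name `evaluate_when` and the pre-evaluated booleans are illustrative:

```python
# Assumed simplification of Ansible's `when:` handling, for illustration only:
# every condition attached to a task must be truthy, and the first falsy one
# is surfaced as "false_condition" in the skip result (as in the log above).
def evaluate_when(conditions):
    """conditions: list of (expression_text, evaluated_bool) pairs."""
    for expr, truthy in conditions:
        if not truthy:
            return {
                "changed": False,
                "false_condition": expr,
                "skip_reason": "Conditional result was False",
            }
    return None  # all conditions held: the task should run

wireless_defined = False  # no wireless connections in network_connections
team_defined = False      # no team connections either
skip = evaluate_when([
    ("ansible_distribution_major_version != '6'", True),
    ("__network_wireless_connections_defined or __network_team_connections_defined",
     wireless_defined or team_defined),
])
print(skip)
```

This mirrors why the log first prints `Evaluated conditional (ansible_distribution_major_version != '6'): True` and only then evaluates the wireless/team condition to False.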
15896 1727203883.42896: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203883.43100: in run() - task 028d2410-947f-fb83-b6ad-000000000087 15896 1727203883.43104: variable 'ansible_search_path' from source: unknown 15896 1727203883.43107: variable 'ansible_search_path' from source: unknown 15896 1727203883.43116: calling self._execute() 15896 1727203883.43405: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.43409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.43504: variable 'omit' from source: magic vars 15896 1727203883.45182: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.45186: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203883.45218: variable 'network_provider' from source: set_fact 15896 1727203883.45234: variable 'network_state' from source: role '' defaults 15896 1727203883.45782: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15896 1727203883.45785: variable 'omit' from source: magic vars 15896 1727203883.45787: variable 'omit' from source: magic vars 15896 1727203883.45790: variable 'network_service_name' from source: role '' defaults 15896 1727203883.45792: variable 'network_service_name' from source: role '' defaults 15896 1727203883.45887: variable '__network_provider_setup' from source: role '' defaults 15896 1727203883.46089: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203883.46153: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203883.46171: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203883.46238: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203883.46655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15896 1727203883.49036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203883.49109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203883.49149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203883.49207: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203883.49236: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203883.49319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.49512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.49558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.49608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.49628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.49679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203883.49909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.49986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.49994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.50015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.50412: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203883.50722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.50775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.50827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.50878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.50899: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.50995: variable 'ansible_python' from source: facts 15896 1727203883.51023: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203883.51112: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203883.51192: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203883.51322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.51350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.51385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.51430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.51450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.51508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203883.51546: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203883.51580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.51617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203883.51631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203883.51750: variable 'network_connections' from source: task vars 15896 1727203883.51764: variable 'controller_profile' from source: play vars 15896 1727203883.51839: variable 'controller_profile' from source: play vars 15896 1727203883.51941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203883.52125: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203883.52184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203883.52232: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203883.52302: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203883.52374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203883.52410: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203883.52445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203883.52487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203883.52535: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203883.53286: variable 'network_connections' from source: task vars 15896 1727203883.53480: variable 'controller_profile' from source: play vars 15896 1727203883.53483: variable 'controller_profile' from source: play vars 15896 1727203883.53486: variable '__network_packages_default_wireless' from source: role '' defaults 15896 1727203883.53487: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203883.53953: variable 'network_connections' from source: task vars 15896 1727203883.54188: variable 'controller_profile' from source: play vars 15896 1727203883.54263: variable 'controller_profile' from source: play vars 15896 1727203883.54292: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203883.54374: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203883.54867: variable 'network_connections' from source: task vars 15896 1727203883.55087: variable 'controller_profile' from source: play vars 15896 1727203883.55159: variable 'controller_profile' from source: play vars 15896 1727203883.55307: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203883.55373: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 15896 1727203883.55391: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203883.55446: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203883.55628: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203883.56125: variable 'network_connections' from source: task vars 15896 1727203883.56136: variable 'controller_profile' from source: play vars 15896 1727203883.56202: variable 'controller_profile' from source: play vars 15896 1727203883.56216: variable 'ansible_distribution' from source: facts 15896 1727203883.56224: variable '__network_rh_distros' from source: role '' defaults 15896 1727203883.56234: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.56252: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203883.56429: variable 'ansible_distribution' from source: facts 15896 1727203883.56438: variable '__network_rh_distros' from source: role '' defaults 15896 1727203883.56448: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.56470: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203883.56647: variable 'ansible_distribution' from source: facts 15896 1727203883.56658: variable '__network_rh_distros' from source: role '' defaults 15896 1727203883.56671: variable 'ansible_distribution_major_version' from source: facts 15896 1727203883.56712: variable 'network_provider' from source: set_fact 15896 1727203883.56739: variable 'omit' from source: magic vars 15896 1727203883.56811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203883.56882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203883.57006: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203883.57280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203883.57283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203883.57285: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203883.57287: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.57288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.57353: Set connection var ansible_shell_type to sh 15896 1727203883.57369: Set connection var ansible_connection to ssh 15896 1727203883.57582: Set connection var ansible_shell_executable to /bin/sh 15896 1727203883.57584: Set connection var ansible_pipelining to False 15896 1727203883.57586: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203883.57588: Set connection var ansible_timeout to 10 15896 1727203883.57590: variable 'ansible_shell_executable' from source: unknown 15896 1727203883.57592: variable 'ansible_connection' from source: unknown 15896 1727203883.57594: variable 'ansible_module_compression' from source: unknown 15896 1727203883.57596: variable 'ansible_shell_type' from source: unknown 15896 1727203883.57598: variable 'ansible_shell_executable' from source: unknown 15896 1727203883.57600: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203883.57602: variable 'ansible_pipelining' from source: unknown 15896 1727203883.57604: variable 'ansible_timeout' from source: unknown 15896 1727203883.57606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203883.57768: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203883.58087: variable 'omit' from source: magic vars 15896 1727203883.58091: starting attempt loop 15896 1727203883.58093: running the handler 15896 1727203883.58098: variable 'ansible_facts' from source: unknown 15896 1727203883.59631: _low_level_execute_command(): starting 15896 1727203883.59642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203883.61088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203883.61189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203883.61299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203883.61504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203883.61632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203883.63424: stdout chunk 
(state=3): >>>/root <<< 15896 1727203883.63522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203883.63557: stderr chunk (state=3): >>><<< 15896 1727203883.63574: stdout chunk (state=3): >>><<< 15896 1727203883.63805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203883.63809: _low_level_execute_command(): starting 15896 1727203883.63812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137 `" && echo ansible-tmp-1727203883.6371434-18163-183012365423137="` echo /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137 `" ) && sleep 0' 15896 1727203883.64988: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203883.65110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203883.65130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203883.65325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203883.67361: stdout chunk (state=3): >>>ansible-tmp-1727203883.6371434-18163-183012365423137=/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137 <<< 15896 1727203883.67489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203883.67539: stderr chunk (state=3): >>><<< 15896 1727203883.67554: stdout chunk (state=3): >>><<< 15896 1727203883.67579: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203883.6371434-18163-183012365423137=/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203883.67633: variable 'ansible_module_compression' from source: unknown 15896 1727203883.67786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15896 1727203883.67789: variable 'ansible_facts' from source: unknown 15896 1727203883.68025: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py 15896 1727203883.68242: Sending initial data 15896 1727203883.68245: Sent initial data (156 bytes) 15896 1727203883.68998: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203883.69018: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203883.69086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203883.69137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203883.69154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203883.69188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203883.69319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203883.71058: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203883.71089: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203883.71157: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203883.71241: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpyj1wjqxa /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py <<< 15896 1727203883.71244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py" <<< 15896 1727203883.71333: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpyj1wjqxa" to remote "/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py" <<< 15896 1727203883.73130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203883.73133: stdout chunk (state=3): >>><<< 15896 1727203883.73136: stderr chunk (state=3): >>><<< 15896 1727203883.73138: done transferring module to remote 15896 1727203883.73140: _low_level_execute_command(): starting 15896 1727203883.73143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/ /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py && sleep 0' 15896 1727203883.73752: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203883.73770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203883.73787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203883.73842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203883.73915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203883.73963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203883.73966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203883.74083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203883.76112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203883.76115: stdout chunk (state=3): >>><<< 15896 1727203883.76117: stderr chunk (state=3): >>><<< 15896 1727203883.76182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203883.76185: _low_level_execute_command(): starting 15896 1727203883.76188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/AnsiballZ_systemd.py && sleep 0' 15896 1727203883.76855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203883.76880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203883.76905: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15896 1727203883.77029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.08153: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10616832", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303383040", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "747703000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": 
"infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15896 1727203884.08205: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", 
"SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": 
"Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15896 1727203884.10782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203884.10792: stdout chunk (state=3): >>><<< 15896 1727203884.10796: stderr chunk (state=3): >>><<< 15896 1727203884.10804: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10616832", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303383040", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "747703000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203884.10976: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203884.10995: _low_level_execute_command(): starting 15896 1727203884.11000: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203883.6371434-18163-183012365423137/ > /dev/null 2>&1 && sleep 0' 15896 1727203884.11621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203884.11629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203884.11678: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203884.11688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.11754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203884.11768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.11795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.11902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.14248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203884.14252: stdout chunk (state=3): >>><<< 15896 1727203884.14281: stderr chunk (state=3): >>><<< 15896 1727203884.14285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203884.14287: handler run complete 15896 1727203884.14399: attempt loop complete, returning result 15896 1727203884.14403: _execute() done 15896 1727203884.14481: dumping result to json 15896 1727203884.14484: done dumping result, returning 15896 1727203884.14487: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-fb83-b6ad-000000000087] 15896 1727203884.14492: sending task result for task 028d2410-947f-fb83-b6ad-000000000087 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203884.15129: no more pending results, returning what we have 15896 1727203884.15132: results queue empty 15896 1727203884.15133: checking for any_errors_fatal 15896 1727203884.15139: done checking for any_errors_fatal 15896 1727203884.15140: checking for max_fail_percentage 15896 1727203884.15141: done checking for max_fail_percentage 15896 1727203884.15142: checking to see if all hosts have failed and the running result is not ok 15896 1727203884.15143: done checking to see if all hosts have failed 15896 1727203884.15143: getting the remaining hosts 
for this loop 15896 1727203884.15145: done getting the remaining hosts for this loop 15896 1727203884.15148: getting the next task for host managed-node1 15896 1727203884.15154: done getting next task for host managed-node1 15896 1727203884.15157: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203884.15159: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203884.15172: getting variables 15896 1727203884.15174: in VariableManager get_vars() 15896 1727203884.15255: Calling all_inventory to load vars for managed-node1 15896 1727203884.15258: Calling groups_inventory to load vars for managed-node1 15896 1727203884.15264: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203884.15477: Calling all_plugins_play to load vars for managed-node1 15896 1727203884.15483: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203884.15487: Calling groups_plugins_play to load vars for managed-node1 15896 1727203884.16188: done sending task result for task 028d2410-947f-fb83-b6ad-000000000087 15896 1727203884.16192: WORKER PROCESS EXITING 15896 1727203884.17105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203884.18809: done with get_vars() 15896 1727203884.18836: done getting variables 15896 1727203884.18905: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:24 -0400 (0:00:00.774) 0:00:29.778 ***** 15896 1727203884.18937: entering _queue_task() for managed-node1/service 15896 1727203884.19392: worker is 1 (out of 1 available) 15896 1727203884.19405: exiting _queue_task() for managed-node1/service 15896 1727203884.19418: done queuing things up, now waiting for results queue to drain 15896 1727203884.19420: waiting for pending results... 15896 1727203884.19648: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203884.19929: in run() - task 028d2410-947f-fb83-b6ad-000000000088 15896 1727203884.19951: variable 'ansible_search_path' from source: unknown 15896 1727203884.19989: variable 'ansible_search_path' from source: unknown 15896 1727203884.20032: calling self._execute() 15896 1727203884.20491: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.20495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.20498: variable 'omit' from source: magic vars 15896 1727203884.21108: variable 'ansible_distribution_major_version' from source: facts 15896 1727203884.21126: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203884.21257: variable 'network_provider' from source: set_fact 15896 1727203884.21392: Evaluated conditional (network_provider == "nm"): True 15896 1727203884.21579: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 
1727203884.21772: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203884.22102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203884.26518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203884.26713: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203884.26756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203884.26805: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203884.26906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203884.27709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203884.27818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203884.27981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203884.27985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203884.28008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 15896 1727203884.28131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203884.28164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203884.28210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203884.28443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203884.28447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203884.28450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203884.28574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203884.28607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203884.28650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203884.28879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203884.29040: variable 'network_connections' from source: task vars 15896 1727203884.29062: variable 'controller_profile' from source: play vars 15896 1727203884.29142: variable 'controller_profile' from source: play vars 15896 1727203884.29387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203884.29679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203884.29822: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203884.29858: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203884.29953: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203884.30081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203884.30085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203884.30167: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203884.30200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203884.30303: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203884.30820: variable 'network_connections' from source: task vars 15896 1727203884.30981: variable 'controller_profile' from source: play vars 15896 1727203884.30984: variable 'controller_profile' from source: play vars 15896 1727203884.31224: Evaluated conditional (__network_wpa_supplicant_required): False 15896 1727203884.31227: when evaluation is False, skipping this task 15896 1727203884.31230: _execute() done 15896 1727203884.31233: dumping result to json 15896 1727203884.31235: done dumping result, returning 15896 1727203884.31237: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-fb83-b6ad-000000000088] 15896 1727203884.31247: sending task result for task 028d2410-947f-fb83-b6ad-000000000088 15896 1727203884.31322: done sending task result for task 028d2410-947f-fb83-b6ad-000000000088 15896 1727203884.31378: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15896 1727203884.31430: no more pending results, returning what we have 15896 1727203884.31433: results queue empty 15896 1727203884.31434: checking for any_errors_fatal 15896 1727203884.31455: done checking for any_errors_fatal 15896 1727203884.31456: checking for max_fail_percentage 15896 1727203884.31458: done checking for max_fail_percentage 15896 1727203884.31459: checking to see if all hosts have failed and the running result is not ok 15896 1727203884.31460: done checking to see if all hosts have failed 15896 1727203884.31463: getting the remaining hosts for this loop 15896 1727203884.31465: done getting the remaining hosts for this loop 15896 1727203884.31469: getting 
the next task for host managed-node1 15896 1727203884.31478: done getting next task for host managed-node1 15896 1727203884.31482: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203884.31485: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203884.31505: getting variables 15896 1727203884.31507: in VariableManager get_vars() 15896 1727203884.31568: Calling all_inventory to load vars for managed-node1 15896 1727203884.31571: Calling groups_inventory to load vars for managed-node1 15896 1727203884.31574: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203884.31790: Calling all_plugins_play to load vars for managed-node1 15896 1727203884.31794: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203884.31797: Calling groups_plugins_play to load vars for managed-node1 15896 1727203884.33591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203884.35317: done with get_vars() 15896 1727203884.35343: done getting variables 15896 1727203884.35410: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:51:24 -0400 (0:00:00.165) 0:00:29.943 ***** 15896 1727203884.35444: entering _queue_task() for managed-node1/service 15896 1727203884.36201: worker is 1 (out of 1 available) 15896 1727203884.36214: exiting _queue_task() for managed-node1/service 15896 1727203884.36229: done queuing things up, now waiting for results queue to drain 15896 1727203884.36230: waiting for pending results... 15896 1727203884.36547: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203884.36853: in run() - task 028d2410-947f-fb83-b6ad-000000000089 15896 1727203884.36857: variable 'ansible_search_path' from source: unknown 15896 1727203884.36861: variable 'ansible_search_path' from source: unknown 15896 1727203884.36863: calling self._execute() 15896 1727203884.36902: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.36913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.36927: variable 'omit' from source: magic vars 15896 1727203884.37299: variable 'ansible_distribution_major_version' from source: facts 15896 1727203884.37315: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203884.37432: variable 'network_provider' from source: set_fact 15896 1727203884.37445: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203884.37452: when evaluation is False, skipping this task 15896 1727203884.37460: _execute() done 15896 1727203884.37469: dumping result to json 15896 1727203884.37478: done dumping result, returning 15896 1727203884.37491: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-fb83-b6ad-000000000089] 15896 1727203884.37582: sending task 
result for task 028d2410-947f-fb83-b6ad-000000000089 15896 1727203884.37654: done sending task result for task 028d2410-947f-fb83-b6ad-000000000089 15896 1727203884.37658: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203884.37703: no more pending results, returning what we have 15896 1727203884.37708: results queue empty 15896 1727203884.37709: checking for any_errors_fatal 15896 1727203884.37716: done checking for any_errors_fatal 15896 1727203884.37717: checking for max_fail_percentage 15896 1727203884.37718: done checking for max_fail_percentage 15896 1727203884.37719: checking to see if all hosts have failed and the running result is not ok 15896 1727203884.37719: done checking to see if all hosts have failed 15896 1727203884.37720: getting the remaining hosts for this loop 15896 1727203884.37721: done getting the remaining hosts for this loop 15896 1727203884.37724: getting the next task for host managed-node1 15896 1727203884.37731: done getting next task for host managed-node1 15896 1727203884.37735: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203884.37738: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203884.37756: getting variables 15896 1727203884.37757: in VariableManager get_vars() 15896 1727203884.37812: Calling all_inventory to load vars for managed-node1 15896 1727203884.37815: Calling groups_inventory to load vars for managed-node1 15896 1727203884.37817: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203884.37828: Calling all_plugins_play to load vars for managed-node1 15896 1727203884.37831: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203884.37833: Calling groups_plugins_play to load vars for managed-node1 15896 1727203884.39964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203884.42221: done with get_vars() 15896 1727203884.42248: done getting variables 15896 1727203884.42318: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:24 -0400 (0:00:00.069) 0:00:30.012 ***** 15896 1727203884.42357: entering _queue_task() for managed-node1/copy 15896 1727203884.42724: worker is 1 (out of 1 available) 15896 1727203884.42736: exiting _queue_task() for managed-node1/copy 15896 1727203884.42753: done queuing things up, now waiting for results queue to drain 15896 1727203884.42755: waiting for pending results... 
15896 1727203884.43092: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203884.43415: in run() - task 028d2410-947f-fb83-b6ad-00000000008a 15896 1727203884.43419: variable 'ansible_search_path' from source: unknown 15896 1727203884.43421: variable 'ansible_search_path' from source: unknown 15896 1727203884.43423: calling self._execute() 15896 1727203884.43426: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.43428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.43430: variable 'omit' from source: magic vars 15896 1727203884.43793: variable 'ansible_distribution_major_version' from source: facts 15896 1727203884.43809: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203884.43930: variable 'network_provider' from source: set_fact 15896 1727203884.43941: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203884.43948: when evaluation is False, skipping this task 15896 1727203884.43953: _execute() done 15896 1727203884.43964: dumping result to json 15896 1727203884.43972: done dumping result, returning 15896 1727203884.43986: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-fb83-b6ad-00000000008a] 15896 1727203884.43996: sending task result for task 028d2410-947f-fb83-b6ad-00000000008a 15896 1727203884.44100: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008a 15896 1727203884.44107: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203884.44158: no more pending results, returning what we have 15896 1727203884.44161: results queue empty 15896 1727203884.44162: checking for 
any_errors_fatal 15896 1727203884.44167: done checking for any_errors_fatal 15896 1727203884.44167: checking for max_fail_percentage 15896 1727203884.44169: done checking for max_fail_percentage 15896 1727203884.44170: checking to see if all hosts have failed and the running result is not ok 15896 1727203884.44170: done checking to see if all hosts have failed 15896 1727203884.44171: getting the remaining hosts for this loop 15896 1727203884.44172: done getting the remaining hosts for this loop 15896 1727203884.44177: getting the next task for host managed-node1 15896 1727203884.44183: done getting next task for host managed-node1 15896 1727203884.44187: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203884.44191: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203884.44208: getting variables 15896 1727203884.44209: in VariableManager get_vars() 15896 1727203884.44416: Calling all_inventory to load vars for managed-node1 15896 1727203884.44419: Calling groups_inventory to load vars for managed-node1 15896 1727203884.44421: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203884.44429: Calling all_plugins_play to load vars for managed-node1 15896 1727203884.44432: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203884.44435: Calling groups_plugins_play to load vars for managed-node1 15896 1727203884.46413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203884.48027: done with get_vars() 15896 1727203884.48050: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:51:24 -0400 (0:00:00.057) 0:00:30.070 ***** 15896 1727203884.48136: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203884.48657: worker is 1 (out of 1 available) 15896 1727203884.48671: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203884.48686: done queuing things up, now waiting for results queue to drain 15896 1727203884.48687: waiting for pending results... 
15896 1727203884.48994: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203884.49134: in run() - task 028d2410-947f-fb83-b6ad-00000000008b 15896 1727203884.49156: variable 'ansible_search_path' from source: unknown 15896 1727203884.49166: variable 'ansible_search_path' from source: unknown 15896 1727203884.49211: calling self._execute() 15896 1727203884.49312: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.49324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.49337: variable 'omit' from source: magic vars 15896 1727203884.49712: variable 'ansible_distribution_major_version' from source: facts 15896 1727203884.49743: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203884.49745: variable 'omit' from source: magic vars 15896 1727203884.49809: variable 'omit' from source: magic vars 15896 1727203884.50070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203884.52153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203884.52228: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203884.52273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203884.52311: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203884.52345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203884.52429: variable 'network_provider' from source: set_fact 15896 1727203884.52573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203884.52622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203884.52655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203884.52706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203884.52723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203884.52796: variable 'omit' from source: magic vars 15896 1727203884.52902: variable 'omit' from source: magic vars 15896 1727203884.53099: variable 'network_connections' from source: task vars 15896 1727203884.53182: variable 'controller_profile' from source: play vars 15896 1727203884.53186: variable 'controller_profile' from source: play vars 15896 1727203884.53338: variable 'omit' from source: magic vars 15896 1727203884.53352: variable '__lsr_ansible_managed' from source: task vars 15896 1727203884.53416: variable '__lsr_ansible_managed' from source: task vars 15896 1727203884.53607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15896 1727203884.53847: Loaded config def from plugin (lookup/template) 15896 1727203884.53864: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15896 1727203884.53900: File lookup term: get_ansible_managed.j2 15896 1727203884.53908: 
variable 'ansible_search_path' from source: unknown 15896 1727203884.53919: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15896 1727203884.53937: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15896 1727203884.53970: variable 'ansible_search_path' from source: unknown 15896 1727203884.66645: variable 'ansible_managed' from source: unknown 15896 1727203884.66728: variable 'omit' from source: magic vars 15896 1727203884.66768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203884.66801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203884.66822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203884.66844: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203884.66865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203884.66890: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203884.66898: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.66968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.67008: Set connection var ansible_shell_type to sh 15896 1727203884.67028: Set connection var ansible_connection to ssh 15896 1727203884.67041: Set connection var ansible_shell_executable to /bin/sh 15896 1727203884.67053: Set connection var ansible_pipelining to False 15896 1727203884.67069: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203884.67088: Set connection var ansible_timeout to 10 15896 1727203884.67114: variable 'ansible_shell_executable' from source: unknown 15896 1727203884.67121: variable 'ansible_connection' from source: unknown 15896 1727203884.67127: variable 'ansible_module_compression' from source: unknown 15896 1727203884.67132: variable 'ansible_shell_type' from source: unknown 15896 1727203884.67138: variable 'ansible_shell_executable' from source: unknown 15896 1727203884.67144: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203884.67189: variable 'ansible_pipelining' from source: unknown 15896 1727203884.67192: variable 'ansible_timeout' from source: unknown 15896 1727203884.67194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203884.67306: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203884.67327: variable 'omit' from source: magic vars 15896 1727203884.67336: starting attempt loop 15896 1727203884.67382: running the handler 15896 1727203884.67386: _low_level_execute_command(): starting 15896 1727203884.67388: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203884.68067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.68088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203884.68184: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.68209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.68332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.70119: stdout chunk (state=3): >>>/root <<< 15896 1727203884.70513: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 15896 1727203884.70517: stdout chunk (state=3): >>><<< 15896 1727203884.70519: stderr chunk (state=3): >>><<< 15896 1727203884.70522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203884.70525: _low_level_execute_command(): starting 15896 1727203884.70527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093 `" && echo ansible-tmp-1727203884.7049-18207-30386089576093="` echo /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093 `" ) && sleep 0' 15896 1727203884.71686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203884.71690: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.71699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203884.71720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.71744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.71891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.73971: stdout chunk (state=3): >>>ansible-tmp-1727203884.7049-18207-30386089576093=/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093 <<< 15896 1727203884.74079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203884.74105: stderr chunk (state=3): >>><<< 15896 1727203884.74108: stdout chunk (state=3): >>><<< 15896 1727203884.74128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203884.7049-18207-30386089576093=/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203884.74172: variable 'ansible_module_compression' from source: unknown 15896 1727203884.74213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15896 1727203884.74257: variable 'ansible_facts' from source: unknown 15896 1727203884.74387: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py 15896 1727203884.74596: Sending initial data 15896 1727203884.74600: Sent initial data (164 bytes) 15896 1727203884.75104: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203884.75113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203884.75151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203884.75187: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.75222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203884.75270: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.75274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.75369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.77132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203884.77221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203884.77309: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpgl_cg1zp /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py <<< 15896 1727203884.77312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py" <<< 15896 1727203884.77373: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpgl_cg1zp" to remote "/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py" <<< 15896 1727203884.78497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203884.78563: stderr chunk (state=3): >>><<< 15896 1727203884.78566: stdout chunk (state=3): >>><<< 15896 1727203884.78574: done transferring module to remote 15896 1727203884.78590: _low_level_execute_command(): starting 15896 1727203884.78599: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/ /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py && sleep 0' 15896 1727203884.79187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203884.79204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203884.79226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203884.79246: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203884.79292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.79363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203884.79384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.79409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.79529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203884.81572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203884.81578: stdout chunk (state=3): >>><<< 15896 1727203884.81581: stderr chunk (state=3): >>><<< 15896 1727203884.81684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203884.81688: _low_level_execute_command(): starting 15896 1727203884.81690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/AnsiballZ_network_connections.py && sleep 0' 15896 1727203884.82292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203884.82312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203884.82323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203884.82340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203884.82448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.27780: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hjg2wgsa/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hjg2wgsa/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/cf994329-c7c7-4568-8772-d142c724631d: error=unknown <<< 15896 1727203885.27989: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15896 1727203885.30222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.30280: stderr chunk (state=3): 
>>>Shared connection to 10.31.14.47 closed. <<< 15896 1727203885.30294: stderr chunk (state=3): >>><<< 15896 1727203885.30303: stdout chunk (state=3): >>><<< 15896 1727203885.30331: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hjg2wgsa/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hjg2wgsa/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/cf994329-c7c7-4568-8772-d142c724631d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203885.30381: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203885.30465: _low_level_execute_command(): starting 15896 1727203885.30468: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727203884.7049-18207-30386089576093/ > /dev/null 2>&1 && sleep 0' 15896 1727203885.31038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203885.31041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.31091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.31164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203885.31208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203885.31249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.31365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.33487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.33491: stdout chunk (state=3): >>><<< 15896 1727203885.33493: stderr chunk (state=3): >>><<< 15896 1727203885.33496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203885.33498: handler run complete 15896 1727203885.33681: attempt loop complete, returning result 15896 1727203885.33685: _execute() done 15896 1727203885.33687: dumping result to json 15896 1727203885.33689: done dumping result, returning 15896 1727203885.33812: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-fb83-b6ad-00000000008b] 15896 1727203885.33819: sending task result for task 028d2410-947f-fb83-b6ad-00000000008b 15896 1727203885.34089: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008b 15896 1727203885.34093: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "bond0",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

15896 1727203885.34200: no more pending results, returning what we have 15896 1727203885.34204: results queue empty 15896 1727203885.34205: checking for any_errors_fatal 15896 1727203885.34210: done checking for any_errors_fatal 15896 1727203885.34211: checking for max_fail_percentage 15896 1727203885.34213: done checking for max_fail_percentage 15896 1727203885.34214: checking to see if all hosts have failed and the running result is not ok 15896 1727203885.34214: done checking to see if all hosts have failed 15896 1727203885.34215: getting the remaining hosts for this loop 15896 1727203885.34216: done getting the remaining hosts for this loop 15896 1727203885.34220: getting the next task for host managed-node1 15896 1727203885.34226: done getting next task for host managed-node1 15896 1727203885.34230: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203885.34233: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15896 1727203885.34244: getting variables 15896 1727203885.34246: in VariableManager get_vars() 15896 1727203885.34422: Calling all_inventory to load vars for managed-node1 15896 1727203885.34424: Calling groups_inventory to load vars for managed-node1 15896 1727203885.34427: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203885.34436: Calling all_plugins_play to load vars for managed-node1 15896 1727203885.34438: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203885.34441: Calling groups_plugins_play to load vars for managed-node1 15896 1727203885.35936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203885.38587: done with get_vars() 15896 1727203885.38610: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Tuesday 24 September 2024  14:51:25 -0400 (0:00:00.905)       0:00:30.976 *****
15896 1727203885.38692: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203885.39035: worker is 1 (out of 1 available) 15896 1727203885.39049: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203885.39066: done queuing things up, now waiting for results queue to drain 15896 1727203885.39068: waiting for pending results... 
15896 1727203885.39332: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203885.39477: in run() - task 028d2410-947f-fb83-b6ad-00000000008c 15896 1727203885.39504: variable 'ansible_search_path' from source: unknown 15896 1727203885.39511: variable 'ansible_search_path' from source: unknown 15896 1727203885.39551: calling self._execute() 15896 1727203885.39658: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.39674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.39694: variable 'omit' from source: magic vars 15896 1727203885.40150: variable 'ansible_distribution_major_version' from source: facts 15896 1727203885.40153: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203885.40211: variable 'network_state' from source: role '' defaults 15896 1727203885.40225: Evaluated conditional (network_state != {}): False 15896 1727203885.40232: when evaluation is False, skipping this task 15896 1727203885.40239: _execute() done 15896 1727203885.40245: dumping result to json 15896 1727203885.40254: done dumping result, returning 15896 1727203885.40270: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-fb83-b6ad-00000000008c] 15896 1727203885.40369: sending task result for task 028d2410-947f-fb83-b6ad-00000000008c 15896 1727203885.40435: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008c 15896 1727203885.40438: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203885.40525: no more pending results, returning what we have 15896 1727203885.40529: results queue empty 15896 1727203885.40530: checking for any_errors_fatal 15896 1727203885.40540: done checking for any_errors_fatal 
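The skip recorded above follows from the role's conditional: `network_state` comes from the role defaults as an empty dict, so `Evaluated conditional (network_state != {}): False` short-circuits the task. A hypothetical reconstruction of the task at `roles/network/tasks/main.yml:171` — the shape is inferred from the log records, not copied from the collection source:

```yaml
# Sketch (assumption): approximate shape of the task the log reports skipping.
# The conditionals mirror the two "Evaluated conditional" records above.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```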
15896 1727203885.40541: checking for max_fail_percentage 15896 1727203885.40543: done checking for max_fail_percentage 15896 1727203885.40543: checking to see if all hosts have failed and the running result is not ok 15896 1727203885.40544: done checking to see if all hosts have failed 15896 1727203885.40545: getting the remaining hosts for this loop 15896 1727203885.40546: done getting the remaining hosts for this loop 15896 1727203885.40550: getting the next task for host managed-node1 15896 1727203885.40557: done getting next task for host managed-node1 15896 1727203885.40564: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203885.40567: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203885.40788: getting variables 15896 1727203885.40790: in VariableManager get_vars() 15896 1727203885.40832: Calling all_inventory to load vars for managed-node1 15896 1727203885.40834: Calling groups_inventory to load vars for managed-node1 15896 1727203885.40836: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203885.40845: Calling all_plugins_play to load vars for managed-node1 15896 1727203885.40847: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203885.40850: Calling groups_plugins_play to load vars for managed-node1 15896 1727203885.42098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203885.43652: done with get_vars() 15896 1727203885.43684: done getting variables 15896 1727203885.43748: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:51:25 -0400 (0:00:00.050) 0:00:31.027 ***** 15896 1727203885.43788: entering _queue_task() for managed-node1/debug 15896 1727203885.44138: worker is 1 (out of 1 available) 15896 1727203885.44151: exiting _queue_task() for managed-node1/debug 15896 1727203885.44166: done queuing things up, now waiting for results queue to drain 15896 1727203885.44168: waiting for pending results... 
15896 1727203885.44474: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203885.44620: in run() - task 028d2410-947f-fb83-b6ad-00000000008d 15896 1727203885.44641: variable 'ansible_search_path' from source: unknown 15896 1727203885.44648: variable 'ansible_search_path' from source: unknown 15896 1727203885.44698: calling self._execute() 15896 1727203885.44809: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.44821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.44837: variable 'omit' from source: magic vars 15896 1727203885.45242: variable 'ansible_distribution_major_version' from source: facts 15896 1727203885.45259: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203885.45347: variable 'omit' from source: magic vars 15896 1727203885.45350: variable 'omit' from source: magic vars 15896 1727203885.45387: variable 'omit' from source: magic vars 15896 1727203885.45433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203885.45483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203885.45508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203885.45530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.45547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.45591: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203885.45601: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.45610: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203885.45717: Set connection var ansible_shell_type to sh 15896 1727203885.45728: Set connection var ansible_connection to ssh 15896 1727203885.45736: Set connection var ansible_shell_executable to /bin/sh 15896 1727203885.45743: Set connection var ansible_pipelining to False 15896 1727203885.45781: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203885.45784: Set connection var ansible_timeout to 10 15896 1727203885.45787: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.45794: variable 'ansible_connection' from source: unknown 15896 1727203885.45801: variable 'ansible_module_compression' from source: unknown 15896 1727203885.45806: variable 'ansible_shell_type' from source: unknown 15896 1727203885.45811: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.45816: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.45822: variable 'ansible_pipelining' from source: unknown 15896 1727203885.45827: variable 'ansible_timeout' from source: unknown 15896 1727203885.45881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.45970: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203885.45987: variable 'omit' from source: magic vars 15896 1727203885.45999: starting attempt loop 15896 1727203885.46005: running the handler 15896 1727203885.46132: variable '__network_connections_result' from source: set_fact 15896 1727203885.46186: handler run complete 15896 1727203885.46207: attempt loop complete, returning result 15896 1727203885.46218: _execute() done 15896 1727203885.46224: dumping result to json 15896 1727203885.46231: 
done dumping result, returning 15896 1727203885.46325: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-fb83-b6ad-00000000008d] 15896 1727203885.46328: sending task result for task 028d2410-947f-fb83-b6ad-00000000008d 15896 1727203885.46401: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008d 15896 1727203885.46404: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15896 1727203885.46501: no more pending results, returning what we have 15896 1727203885.46505: results queue empty 15896 1727203885.46506: checking for any_errors_fatal 15896 1727203885.46513: done checking for any_errors_fatal 15896 1727203885.46514: checking for max_fail_percentage 15896 1727203885.46516: done checking for max_fail_percentage 15896 1727203885.46517: checking to see if all hosts have failed and the running result is not ok 15896 1727203885.46517: done checking to see if all hosts have failed 15896 1727203885.46518: getting the remaining hosts for this loop 15896 1727203885.46520: done getting the remaining hosts for this loop 15896 1727203885.46523: getting the next task for host managed-node1 15896 1727203885.46530: done getting next task for host managed-node1 15896 1727203885.46534: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203885.46537: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203885.46550: getting variables 15896 1727203885.46552: in VariableManager get_vars() 15896 1727203885.46612: Calling all_inventory to load vars for managed-node1 15896 1727203885.46614: Calling groups_inventory to load vars for managed-node1 15896 1727203885.46617: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203885.46628: Calling all_plugins_play to load vars for managed-node1 15896 1727203885.46631: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203885.46634: Calling groups_plugins_play to load vars for managed-node1 15896 1727203885.48341: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203885.54884: done with get_vars() 15896 1727203885.54911: done getting variables 15896 1727203885.54964: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:51:25 -0400 (0:00:00.112) 0:00:31.139 ***** 15896 1727203885.54996: entering _queue_task() for managed-node1/debug 15896 1727203885.55359: worker is 1 (out of 1 available) 15896 1727203885.55372: exiting _queue_task() for managed-node1/debug 15896 1727203885.55486: done queuing things up, now waiting for results queue to drain 15896 1727203885.55488: waiting for pending results... 
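The two debug tasks the log shows at `tasks/main.yml:177` and `:181` only print the result registered by the earlier connection-configuration step. A hedged sketch of what such tasks look like, inferred from the task names and the `var` output in the log (not verbatim from the collection):

```yaml
# Sketch (assumption): debug tasks matching the "ok: [managed-node1]" output
# above — the first prints only stderr_lines, the second the whole result.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```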
15896 1727203885.55735: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203885.55866: in run() - task 028d2410-947f-fb83-b6ad-00000000008e 15896 1727203885.55882: variable 'ansible_search_path' from source: unknown 15896 1727203885.55887: variable 'ansible_search_path' from source: unknown 15896 1727203885.55915: calling self._execute() 15896 1727203885.56008: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.56013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.56023: variable 'omit' from source: magic vars 15896 1727203885.56319: variable 'ansible_distribution_major_version' from source: facts 15896 1727203885.56328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203885.56334: variable 'omit' from source: magic vars 15896 1727203885.56382: variable 'omit' from source: magic vars 15896 1727203885.56408: variable 'omit' from source: magic vars 15896 1727203885.56442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203885.56469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203885.56489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203885.56502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.56512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.56536: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203885.56539: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.56541: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203885.56618: Set connection var ansible_shell_type to sh 15896 1727203885.56623: Set connection var ansible_connection to ssh 15896 1727203885.56628: Set connection var ansible_shell_executable to /bin/sh 15896 1727203885.56633: Set connection var ansible_pipelining to False 15896 1727203885.56639: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203885.56644: Set connection var ansible_timeout to 10 15896 1727203885.56663: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.56668: variable 'ansible_connection' from source: unknown 15896 1727203885.56671: variable 'ansible_module_compression' from source: unknown 15896 1727203885.56674: variable 'ansible_shell_type' from source: unknown 15896 1727203885.56678: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.56680: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.56684: variable 'ansible_pipelining' from source: unknown 15896 1727203885.56687: variable 'ansible_timeout' from source: unknown 15896 1727203885.56689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.56789: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203885.56802: variable 'omit' from source: magic vars 15896 1727203885.56806: starting attempt loop 15896 1727203885.56808: running the handler 15896 1727203885.56845: variable '__network_connections_result' from source: set_fact 15896 1727203885.56901: variable '__network_connections_result' from source: set_fact 15896 1727203885.56978: handler run complete 15896 1727203885.56994: attempt loop complete, returning result 15896 1727203885.56997: 
_execute() done 15896 1727203885.57000: dumping result to json 15896 1727203885.57004: done dumping result, returning 15896 1727203885.57012: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-fb83-b6ad-00000000008e] 15896 1727203885.57015: sending task result for task 028d2410-947f-fb83-b6ad-00000000008e 15896 1727203885.57107: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008e 15896 1727203885.57110: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15896 1727203885.57218: no more pending results, returning what we have 15896 1727203885.57221: results queue empty 15896 1727203885.57221: checking for any_errors_fatal 15896 1727203885.57228: done checking for any_errors_fatal 15896 1727203885.57228: checking for max_fail_percentage 15896 1727203885.57230: done checking for max_fail_percentage 15896 1727203885.57230: checking to see if all hosts have failed and the running result is not ok 15896 1727203885.57231: done checking to see if all hosts have failed 15896 1727203885.57232: getting the remaining hosts for this loop 15896 1727203885.57233: done getting the remaining hosts for this loop 15896 1727203885.57236: getting the next task for host managed-node1 15896 1727203885.57240: done getting next task for host managed-node1 15896 1727203885.57246: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203885.57248: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203885.57258: getting variables 15896 1727203885.57259: in VariableManager get_vars() 15896 1727203885.57303: Calling all_inventory to load vars for managed-node1 15896 1727203885.57305: Calling groups_inventory to load vars for managed-node1 15896 1727203885.57307: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203885.57315: Calling all_plugins_play to load vars for managed-node1 15896 1727203885.57317: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203885.57319: Calling groups_plugins_play to load vars for managed-node1 15896 1727203885.58367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203885.59484: done with get_vars() 15896 1727203885.59500: done getting variables 15896 1727203885.59544: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:51:25 -0400 (0:00:00.045) 0:00:31.185 ***** 15896 1727203885.59573: entering _queue_task() for managed-node1/debug 15896 1727203885.59812: 
worker is 1 (out of 1 available) 15896 1727203885.59827: exiting _queue_task() for managed-node1/debug 15896 1727203885.59839: done queuing things up, now waiting for results queue to drain 15896 1727203885.59841: waiting for pending results... 15896 1727203885.60025: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203885.60123: in run() - task 028d2410-947f-fb83-b6ad-00000000008f 15896 1727203885.60135: variable 'ansible_search_path' from source: unknown 15896 1727203885.60138: variable 'ansible_search_path' from source: unknown 15896 1727203885.60168: calling self._execute() 15896 1727203885.60243: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.60247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.60257: variable 'omit' from source: magic vars 15896 1727203885.60542: variable 'ansible_distribution_major_version' from source: facts 15896 1727203885.60551: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203885.60636: variable 'network_state' from source: role '' defaults 15896 1727203885.60660: Evaluated conditional (network_state != {}): False 15896 1727203885.60663: when evaluation is False, skipping this task 15896 1727203885.60666: _execute() done 15896 1727203885.60669: dumping result to json 15896 1727203885.60671: done dumping result, returning 15896 1727203885.60674: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-fb83-b6ad-00000000008f] 15896 1727203885.60699: sending task result for task 028d2410-947f-fb83-b6ad-00000000008f 15896 1727203885.60772: done sending task result for task 028d2410-947f-fb83-b6ad-00000000008f 15896 1727203885.60774: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 15896 1727203885.60818: 
no more pending results, returning what we have 15896 1727203885.60822: results queue empty 15896 1727203885.60822: checking for any_errors_fatal 15896 1727203885.60830: done checking for any_errors_fatal 15896 1727203885.60831: checking for max_fail_percentage 15896 1727203885.60833: done checking for max_fail_percentage 15896 1727203885.60834: checking to see if all hosts have failed and the running result is not ok 15896 1727203885.60834: done checking to see if all hosts have failed 15896 1727203885.60835: getting the remaining hosts for this loop 15896 1727203885.60836: done getting the remaining hosts for this loop 15896 1727203885.60840: getting the next task for host managed-node1 15896 1727203885.60845: done getting next task for host managed-node1 15896 1727203885.60849: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203885.60851: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203885.60869: getting variables 15896 1727203885.60871: in VariableManager get_vars() 15896 1727203885.60918: Calling all_inventory to load vars for managed-node1 15896 1727203885.60921: Calling groups_inventory to load vars for managed-node1 15896 1727203885.60923: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203885.60932: Calling all_plugins_play to load vars for managed-node1 15896 1727203885.60934: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203885.60936: Calling groups_plugins_play to load vars for managed-node1 15896 1727203885.62408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203885.63343: done with get_vars() 15896 1727203885.63359: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:51:25 -0400 (0:00:00.038) 0:00:31.223 ***** 15896 1727203885.63434: entering _queue_task() for managed-node1/ping 15896 1727203885.63686: worker is 1 (out of 1 available) 15896 1727203885.63698: exiting _queue_task() for managed-node1/ping 15896 1727203885.63710: done queuing things up, now waiting for results queue to drain 15896 1727203885.63711: waiting for pending results... 
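The "Re-test connectivity" task at `tasks/main.yml:192` resolves to `managed-node1/ping` in the queue records above, i.e. it is a plain `ping` module call used as a post-change liveness check. An illustrative sketch under that assumption:

```yaml
# Sketch (assumption): a minimal connectivity re-test matching the
# "entering _queue_task() for managed-node1/ping" record.
- name: Re-test connectivity
  ansible.builtin.ping:
```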
15896 1727203885.63904: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203885.64002: in run() - task 028d2410-947f-fb83-b6ad-000000000090 15896 1727203885.64013: variable 'ansible_search_path' from source: unknown 15896 1727203885.64017: variable 'ansible_search_path' from source: unknown 15896 1727203885.64049: calling self._execute() 15896 1727203885.64126: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.64129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.64139: variable 'omit' from source: magic vars 15896 1727203885.64495: variable 'ansible_distribution_major_version' from source: facts 15896 1727203885.64509: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203885.64583: variable 'omit' from source: magic vars 15896 1727203885.64587: variable 'omit' from source: magic vars 15896 1727203885.64607: variable 'omit' from source: magic vars 15896 1727203885.64645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203885.64786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203885.64789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203885.64792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.64794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203885.64797: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203885.64800: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.64802: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203885.64861: Set connection var ansible_shell_type to sh 15896 1727203885.64872: Set connection var ansible_connection to ssh 15896 1727203885.64878: Set connection var ansible_shell_executable to /bin/sh 15896 1727203885.64883: Set connection var ansible_pipelining to False 15896 1727203885.64893: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203885.64895: Set connection var ansible_timeout to 10 15896 1727203885.64916: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.64919: variable 'ansible_connection' from source: unknown 15896 1727203885.64922: variable 'ansible_module_compression' from source: unknown 15896 1727203885.64925: variable 'ansible_shell_type' from source: unknown 15896 1727203885.64927: variable 'ansible_shell_executable' from source: unknown 15896 1727203885.64929: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203885.64931: variable 'ansible_pipelining' from source: unknown 15896 1727203885.64933: variable 'ansible_timeout' from source: unknown 15896 1727203885.64939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203885.65140: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203885.65150: variable 'omit' from source: magic vars 15896 1727203885.65156: starting attempt loop 15896 1727203885.65159: running the handler 15896 1727203885.65179: _low_level_execute_command(): starting 15896 1727203885.65187: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203885.65983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203885.65988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 
1727203885.65996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203885.66002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203885.66005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203885.66007: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203885.66012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.66015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203885.66018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203885.66020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203885.66042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203885.66085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203885.66089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.66151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203885.66160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.66243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.68023: stdout chunk (state=3): >>>/root <<< 15896 1727203885.68155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.68159: stdout chunk 
(state=3): >>><<< 15896 1727203885.68162: stderr chunk (state=3): >>><<< 15896 1727203885.68183: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203885.68194: _low_level_execute_command(): starting 15896 1727203885.68199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393 `" && echo ansible-tmp-1727203885.6817908-18252-15024028468393="` echo /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393 `" ) && sleep 0' 15896 1727203885.68709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.68738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203885.68741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203885.68784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.68859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.70979: stdout chunk (state=3): >>>ansible-tmp-1727203885.6817908-18252-15024028468393=/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393 <<< 15896 1727203885.71081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.71119: stderr chunk (state=3): >>><<< 15896 1727203885.71122: stdout chunk (state=3): >>><<< 15896 1727203885.71138: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203885.6817908-18252-15024028468393=/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203885.71179: variable 'ansible_module_compression' from source: unknown 15896 1727203885.71213: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15896 1727203885.71242: variable 'ansible_facts' from source: unknown 15896 1727203885.71296: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py 15896 1727203885.71400: Sending initial data 15896 1727203885.71404: Sent initial data (152 bytes) 15896 1727203885.71862: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203885.71977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203885.72084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.72296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.74111: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203885.74199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203885.74281: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp2wk64fek /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py <<< 15896 1727203885.74285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py" <<< 15896 1727203885.74349: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp2wk64fek" to remote "/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py" <<< 15896 1727203885.75696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.75754: stderr chunk (state=3): >>><<< 15896 1727203885.75769: stdout chunk (state=3): >>><<< 15896 1727203885.75985: done transferring module to remote 15896 1727203885.75989: _low_level_execute_command(): starting 15896 1727203885.75993: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/ /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py && sleep 0' 15896 1727203885.77150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203885.77153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203885.77156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass <<< 15896 1727203885.77158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203885.77160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.77491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.77595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.79736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203885.79751: stdout chunk (state=3): >>><<< 15896 1727203885.79763: stderr chunk (state=3): >>><<< 15896 1727203885.79797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203885.79805: _low_level_execute_command(): starting 15896 1727203885.79881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/AnsiballZ_ping.py && sleep 0' 15896 1727203885.80434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203885.80447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203885.80460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203885.80489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.80505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203885.80542: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.80607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 
15896 1727203885.80638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.80764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203885.97042: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15896 1727203885.98713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203885.98719: stdout chunk (state=3): >>><<< 15896 1727203885.98722: stderr chunk (state=3): >>><<< 15896 1727203885.98753: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
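The `{"ping": "pong", ...}` stdout chunk above is the JSON document the remote module prints, which the controller parses as the task result. A minimal sketch of that contract follows — an illustrative approximation of an Ansible-style `ping` module, not the actual `ansible.modules.ping` source (the function name `ping_module` and the simplified failure path are assumptions for illustration):

```python
# Illustrative sketch: what a "ping"-style Ansible module does on the
# managed node. It receives its arguments, echoes the "data" value back,
# and emits exactly one JSON object on stdout for the controller to parse.
import json


def ping_module(module_args):
    """Return the result dict that would be serialized to stdout."""
    data = module_args.get("data", "pong")
    # The real module treats data == "crash" as a deliberate failure;
    # mirrored here in simplified form.
    if data == "crash":
        return {"failed": True, "msg": "boom"}
    # Echo the argument back, plus the invocation record seen in the log.
    return {"ping": data, "invocation": {"module_args": module_args}}


if __name__ == "__main__":
    # Matches the stdout chunk captured in the log above.
    print(json.dumps(ping_module({"data": "pong"})))
```

The controller only trusts stdout for this JSON; everything on stderr (the OpenSSH `debug1:`/`debug2:` lines) is connection noise that `_low_level_execute_command()` reports separately.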
15896 1727203885.98784: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203885.98794: _low_level_execute_command(): starting 15896 1727203885.98806: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203885.6817908-18252-15024028468393/ > /dev/null 2>&1 && sleep 0' 15896 1727203885.99253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203885.99256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203885.99259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.99264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203885.99266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203885.99316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203885.99322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203885.99401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.01455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.01459: stderr chunk (state=3): >>><<< 15896 1727203886.01461: stdout chunk (state=3): >>><<< 15896 1727203886.01464: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
15896 1727203886.01474: handler run complete 15896 1727203886.01545: attempt loop complete, returning result 15896 1727203886.01548: _execute() done 15896 1727203886.01550: dumping result to json 15896 1727203886.01656: done dumping result, returning 15896 1727203886.01658: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-fb83-b6ad-000000000090] 15896 1727203886.01660: sending task result for task 028d2410-947f-fb83-b6ad-000000000090 15896 1727203886.01729: done sending task result for task 028d2410-947f-fb83-b6ad-000000000090 15896 1727203886.01732: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 15896 1727203886.01824: no more pending results, returning what we have 15896 1727203886.01829: results queue empty 15896 1727203886.01830: checking for any_errors_fatal 15896 1727203886.01836: done checking for any_errors_fatal 15896 1727203886.01837: checking for max_fail_percentage 15896 1727203886.01839: done checking for max_fail_percentage 15896 1727203886.01840: checking to see if all hosts have failed and the running result is not ok 15896 1727203886.01841: done checking to see if all hosts have failed 15896 1727203886.01842: getting the remaining hosts for this loop 15896 1727203886.01843: done getting the remaining hosts for this loop 15896 1727203886.01847: getting the next task for host managed-node1 15896 1727203886.01857: done getting next task for host managed-node1 15896 1727203886.01860: ^ task is: TASK: meta (role_complete) 15896 1727203886.01863: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203886.01879: getting variables 15896 1727203886.01881: in VariableManager get_vars() 15896 1727203886.01941: Calling all_inventory to load vars for managed-node1 15896 1727203886.01944: Calling groups_inventory to load vars for managed-node1 15896 1727203886.01946: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203886.01957: Calling all_plugins_play to load vars for managed-node1 15896 1727203886.01960: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203886.01963: Calling groups_plugins_play to load vars for managed-node1 15896 1727203886.03235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203886.05177: done with get_vars() 15896 1727203886.05203: done getting variables 15896 1727203886.05287: done queuing things up, now waiting for results queue to drain 15896 1727203886.05289: results queue empty 15896 1727203886.05289: checking for any_errors_fatal 15896 1727203886.05292: done checking for any_errors_fatal 15896 1727203886.05293: checking for max_fail_percentage 15896 1727203886.05294: done checking for max_fail_percentage 15896 1727203886.05295: checking to see if all hosts have failed and the running result is not ok 15896 1727203886.05295: done checking to see if all hosts have failed 15896 1727203886.05296: getting the remaining hosts for this loop 15896 1727203886.05297: done getting the remaining hosts for this loop 15896 1727203886.05300: getting the next task for host managed-node1 15896 1727203886.05304: done getting next task for host managed-node1 15896 1727203886.05305: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 15896 1727203886.05307: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203886.05309: getting variables 15896 1727203886.05310: in VariableManager get_vars() 15896 1727203886.05332: Calling all_inventory to load vars for managed-node1 15896 1727203886.05334: Calling groups_inventory to load vars for managed-node1 15896 1727203886.05336: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203886.05341: Calling all_plugins_play to load vars for managed-node1 15896 1727203886.05343: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203886.05346: Calling groups_plugins_play to load vars for managed-node1 15896 1727203886.06707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203886.08264: done with get_vars() 15896 1727203886.08287: done getting variables 15896 1727203886.08328: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203886.08449: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Tuesday 24 September 2024 14:51:26 -0400 (0:00:00.450) 0:00:31.674 ***** 15896 1727203886.08476: entering _queue_task() for managed-node1/command 15896 1727203886.08984: worker is 1 (out of 1 available) 15896 1727203886.08998: exiting _queue_task() for managed-node1/command 15896 1727203886.09010: done queuing things up, now waiting for 
results queue to drain 15896 1727203886.09012: waiting for pending results... 15896 1727203886.09394: running TaskExecutor() for managed-node1/TASK: From the active connection, get the port1 profile "bond0.0" 15896 1727203886.09399: in run() - task 028d2410-947f-fb83-b6ad-0000000000c0 15896 1727203886.09407: variable 'ansible_search_path' from source: unknown 15896 1727203886.09450: calling self._execute() 15896 1727203886.09571: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.09781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.09786: variable 'omit' from source: magic vars 15896 1727203886.09998: variable 'ansible_distribution_major_version' from source: facts 15896 1727203886.10021: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203886.10139: variable 'network_provider' from source: set_fact 15896 1727203886.10152: Evaluated conditional (network_provider == "nm"): True 15896 1727203886.10163: variable 'omit' from source: magic vars 15896 1727203886.10194: variable 'omit' from source: magic vars 15896 1727203886.10291: variable 'port1_profile' from source: play vars 15896 1727203886.10314: variable 'omit' from source: magic vars 15896 1727203886.10367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203886.10410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203886.10439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203886.10464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.10492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.10528: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203886.10537: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.10544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.10658: Set connection var ansible_shell_type to sh 15896 1727203886.10678: Set connection var ansible_connection to ssh 15896 1727203886.10689: Set connection var ansible_shell_executable to /bin/sh 15896 1727203886.10699: Set connection var ansible_pipelining to False 15896 1727203886.10709: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203886.10719: Set connection var ansible_timeout to 10 15896 1727203886.10746: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.10754: variable 'ansible_connection' from source: unknown 15896 1727203886.10762: variable 'ansible_module_compression' from source: unknown 15896 1727203886.10774: variable 'ansible_shell_type' from source: unknown 15896 1727203886.10881: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.10884: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.10887: variable 'ansible_pipelining' from source: unknown 15896 1727203886.10889: variable 'ansible_timeout' from source: unknown 15896 1727203886.10891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.10946: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203886.10964: variable 'omit' from source: magic vars 15896 1727203886.10977: starting attempt loop 15896 1727203886.10989: running the handler 15896 1727203886.11012: _low_level_execute_command(): starting 15896 1727203886.11025: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203886.11833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.11864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.11987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.13803: stdout chunk (state=3): >>>/root <<< 15896 1727203886.13889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.13950: stderr chunk (state=3): >>><<< 15896 1727203886.13979: stdout chunk (state=3): >>><<< 15896 1727203886.14097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.14100: _low_level_execute_command(): starting 15896 1727203886.14104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101 `" && echo ansible-tmp-1727203886.1400018-18283-43013709845101="` echo /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101 `" ) && sleep 0' 15896 1727203886.14635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203886.14646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.14659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.14677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.14764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.14798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.14909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.17174: stdout chunk (state=3): >>>ansible-tmp-1727203886.1400018-18283-43013709845101=/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101 <<< 15896 1727203886.17205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.17208: stdout chunk (state=3): >>><<< 15896 1727203886.17211: stderr chunk (state=3): >>><<< 15896 1727203886.17381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203886.1400018-18283-43013709845101=/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.17385: variable 'ansible_module_compression' from source: unknown 15896 1727203886.17388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203886.17390: variable 'ansible_facts' from source: unknown 15896 1727203886.17459: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py 15896 1727203886.17631: Sending initial data 15896 1727203886.17640: Sent initial data (155 bytes) 15896 1727203886.18259: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203886.18369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.18395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.18412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.18523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.20269: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203886.20362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203886.20455: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp8o2mkqjh /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py <<< 15896 1727203886.20485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py" <<< 15896 1727203886.20548: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp8o2mkqjh" to remote "/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py" <<< 15896 1727203886.21581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.21585: stdout chunk (state=3): >>><<< 15896 1727203886.21587: stderr chunk (state=3): >>><<< 15896 1727203886.21589: done transferring module to remote 15896 1727203886.21591: _low_level_execute_command(): starting 15896 1727203886.21594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/ /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py && sleep 0' 15896 1727203886.22268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203886.22282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.22357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.22393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.22514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.24467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.24519: stderr chunk (state=3): >>><<< 15896 1727203886.24539: stdout chunk (state=3): >>><<< 15896 1727203886.24639: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.24647: _low_level_execute_command(): starting 15896 1727203886.24650: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/AnsiballZ_command.py && sleep 0' 15896 1727203886.25184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203886.25200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.25223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.25241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.25264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203886.25330: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203886.25333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.25396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.25655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.43734: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:51:26.417827", "end": "2024-09-24 14:51:26.435632", "delta": "0:00:00.017805", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203886.45792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203886.45796: stdout chunk (state=3): >>><<< 15896 1727203886.45944: stderr chunk (state=3): >>><<< 15896 1727203886.45948: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-24 14:51:26.417827", "end": "2024-09-24 14:51:26.435632", "delta": "0:00:00.017805", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203886.45951: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203886.45953: _low_level_execute_command(): starting 15896 1727203886.45955: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203886.1400018-18283-43013709845101/ > /dev/null 2>&1 && sleep 0' 15896 1727203886.46550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
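
The module run above returns one JSON blob on stdout, and the action plugin parses that blob into the task result (`rc`, `stdout`, `cmd`, timing fields, and the echoed `invocation`). A sketch using a trimmed copy of the JSON from this log:

```python
import json

# Trimmed copy of the module result captured in the log above.
raw = """
{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["nmcli", "c", "show", "--active", "bond0.0"],
 "start": "2024-09-24 14:51:26.417827", "end": "2024-09-24 14:51:26.435632",
 "delta": "0:00:00.017805", "msg": ""}
"""
result = json.loads(raw)

# These are the fields the playbook's later assertions consume:
# the exit code, the (here empty) nmcli output, and the argv list.
print(result["rc"], result["stdout"], result["cmd"][-1])
```

Note the displayed `ok:` result omits `stdout`/`stderr` lines when they are empty strings, but they are present in the parsed JSON.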
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.46563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.46584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.46708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.48742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.48779: stdout chunk (state=3): >>><<< 15896 1727203886.48807: stderr chunk (state=3): >>><<< 15896 1727203886.48828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.48843: handler run complete 15896 1727203886.48872: Evaluated conditional (False): False 15896 1727203886.48889: attempt loop complete, returning result 15896 1727203886.48934: _execute() done 15896 1727203886.48942: dumping result to json 15896 1727203886.48963: done dumping result, returning 15896 1727203886.48986: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the port1 profile "bond0.0" [028d2410-947f-fb83-b6ad-0000000000c0] 15896 1727203886.48996: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c0 ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.017805", "end": "2024-09-24 14:51:26.435632", "rc": 0, "start": "2024-09-24 14:51:26.417827" } 15896 1727203886.49301: no more pending results, returning what we have 15896 1727203886.49305: results queue empty 15896 1727203886.49306: checking for any_errors_fatal 15896 1727203886.49308: done checking for any_errors_fatal 15896 1727203886.49309: checking for max_fail_percentage 15896 1727203886.49310: done checking for max_fail_percentage 15896 1727203886.49311: checking to see if all hosts have failed and the running result is not ok 15896 1727203886.49312: done checking to see if all hosts have failed 15896 1727203886.49312: getting 
the remaining hosts for this loop 15896 1727203886.49314: done getting the remaining hosts for this loop 15896 1727203886.49317: getting the next task for host managed-node1 15896 1727203886.49322: done getting next task for host managed-node1 15896 1727203886.49324: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 15896 1727203886.49326: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203886.49331: getting variables 15896 1727203886.49332: in VariableManager get_vars() 15896 1727203886.49495: Calling all_inventory to load vars for managed-node1 15896 1727203886.49498: Calling groups_inventory to load vars for managed-node1 15896 1727203886.49500: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203886.49514: Calling all_plugins_play to load vars for managed-node1 15896 1727203886.49517: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203886.49520: Calling groups_plugins_play to load vars for managed-node1 15896 1727203886.50043: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c0 15896 1727203886.50047: WORKER PROCESS EXITING 15896 1727203886.50536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203886.52016: done with get_vars() 15896 1727203886.52094: done getting variables 15896 1727203886.52171: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203886.52313: variable 
'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Tuesday 24 September 2024 14:51:26 -0400 (0:00:00.438) 0:00:32.112 ***** 15896 1727203886.52342: entering _queue_task() for managed-node1/command 15896 1727203886.52830: worker is 1 (out of 1 available) 15896 1727203886.52843: exiting _queue_task() for managed-node1/command 15896 1727203886.52855: done queuing things up, now waiting for results queue to drain 15896 1727203886.52856: waiting for pending results... 15896 1727203886.53064: running TaskExecutor() for managed-node1/TASK: From the active connection, get the port2 profile "bond0.1" 15896 1727203886.53134: in run() - task 028d2410-947f-fb83-b6ad-0000000000c1 15896 1727203886.53147: variable 'ansible_search_path' from source: unknown 15896 1727203886.53180: calling self._execute() 15896 1727203886.53260: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.53267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.53277: variable 'omit' from source: magic vars 15896 1727203886.53557: variable 'ansible_distribution_major_version' from source: facts 15896 1727203886.53569: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203886.53650: variable 'network_provider' from source: set_fact 15896 1727203886.53654: Evaluated conditional (network_provider == "nm"): True 15896 1727203886.53660: variable 'omit' from source: magic vars 15896 1727203886.53681: variable 'omit' from source: magic vars 15896 1727203886.53746: variable 'port2_profile' from source: play vars 15896 1727203886.53759: variable 'omit' from source: magic vars 15896 1727203886.53795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203886.53820: Loading Connection 
'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203886.53838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203886.53854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.53866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.53891: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203886.53894: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.53897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.53985: Set connection var ansible_shell_type to sh 15896 1727203886.53992: Set connection var ansible_connection to ssh 15896 1727203886.53998: Set connection var ansible_shell_executable to /bin/sh 15896 1727203886.54003: Set connection var ansible_pipelining to False 15896 1727203886.54008: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203886.54013: Set connection var ansible_timeout to 10 15896 1727203886.54030: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.54033: variable 'ansible_connection' from source: unknown 15896 1727203886.54038: variable 'ansible_module_compression' from source: unknown 15896 1727203886.54040: variable 'ansible_shell_type' from source: unknown 15896 1727203886.54050: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.54053: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.54082: variable 'ansible_pipelining' from source: unknown 15896 1727203886.54085: variable 'ansible_timeout' from source: unknown 15896 1727203886.54088: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203886.54381: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203886.54386: variable 'omit' from source: magic vars 15896 1727203886.54388: starting attempt loop 15896 1727203886.54390: running the handler 15896 1727203886.54392: _low_level_execute_command(): starting 15896 1727203886.54394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203886.54946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203886.54957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.54973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.54991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.55005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203886.55012: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203886.55022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.55037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203886.55046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203886.55054: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203886.55067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.55077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.55089: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.55097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203886.55105: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203886.55115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.55183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.55210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.55314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.57118: stdout chunk (state=3): >>>/root <<< 15896 1727203886.57246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.57253: stderr chunk (state=3): >>><<< 15896 1727203886.57256: stdout chunk (state=3): >>><<< 15896 1727203886.57292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.57302: _low_level_execute_command(): starting 15896 1727203886.57316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954 `" && echo ansible-tmp-1727203886.5728724-18305-178210176216954="` echo /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954 `" ) && sleep 0' 15896 1727203886.57896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.57954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.57969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.58002: 
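
The `echo ~ && sleep 0` command above is Ansible's home-directory probe: it asks the remote shell where `~` expands to so it can resolve the configured `remote_tmp` (the log shows `'_ansible_remote_tmp': '~/.ansible/tmp'`). A local sketch of that probe, assuming a POSIX `/bin/sh`:

```python
import subprocess

# Run the same probe the log shows: tilde expansion happens in the
# remote shell, not on the controller.
home = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True,
).stdout.strip()

# Expand the default remote_tmp the way Ansible does with the probe result.
remote_tmp = "~/.ansible/tmp".replace("~", home, 1)
print(home, remote_tmp)
```

In the log the probe returned `/root`, which is why the subsequent `mkdir` one-liners target `/root/.ansible/tmp`.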
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.58114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.60212: stdout chunk (state=3): >>>ansible-tmp-1727203886.5728724-18305-178210176216954=/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954 <<< 15896 1727203886.60330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.60346: stderr chunk (state=3): >>><<< 15896 1727203886.60349: stdout chunk (state=3): >>><<< 15896 1727203886.60367: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203886.5728724-18305-178210176216954=/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.60395: variable 
'ansible_module_compression' from source: unknown 15896 1727203886.60437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203886.60468: variable 'ansible_facts' from source: unknown 15896 1727203886.60522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py 15896 1727203886.60622: Sending initial data 15896 1727203886.60625: Sent initial data (156 bytes) 15896 1727203886.61049: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.61054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203886.61056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203886.61059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.61064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.61110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.61114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.61203: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.62935: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15896 1727203886.62940: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203886.63005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203886.63083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp5shpyegd /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py <<< 15896 1727203886.63088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py" <<< 15896 1727203886.63154: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp5shpyegd" to remote "/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py" <<< 15896 1727203886.63815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.63854: stderr chunk (state=3): >>><<< 15896 1727203886.63857: stdout chunk (state=3): >>><<< 15896 1727203886.63902: done transferring module to remote 15896 1727203886.63910: _low_level_execute_command(): starting 15896 1727203886.63915: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/ /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py && sleep 0' 15896 1727203886.64360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.64364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.64370: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203886.64372: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203886.64374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.64422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.64427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.64430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.64504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.66489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.66516: stderr chunk (state=3): >>><<< 15896 1727203886.66519: stdout chunk (state=3): >>><<< 15896 1727203886.66532: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.66535: _low_level_execute_command(): starting 15896 1727203886.66541: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/AnsiballZ_command.py && sleep 0' 15896 1727203886.66949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203886.66956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.66981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.66985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203886.66994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.67045: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.67048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203886.67055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.67136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.85129: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:51:26.831833", "end": "2024-09-24 14:51:26.849696", "delta": "0:00:00.017863", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203886.86868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203886.86898: stderr chunk (state=3): >>><<< 15896 1727203886.86901: stdout chunk (state=3): >>><<< 15896 1727203886.86924: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-24 14:51:26.831833", "end": "2024-09-24 14:51:26.849696", "delta": "0:00:00.017863", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203886.86953: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203886.86963: _low_level_execute_command(): starting 15896 1727203886.86966: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203886.5728724-18305-178210176216954/ > /dev/null 2>&1 && sleep 0' 15896 1727203886.87427: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203886.87431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.87433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203886.87435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203886.87437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203886.87481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203886.87496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203886.87587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203886.89573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203886.89579: stdout chunk (state=3): >>><<< 15896 1727203886.89584: stderr chunk (state=3): >>><<< 15896 1727203886.89599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203886.89610: handler run complete 15896 1727203886.89626: Evaluated conditional (False): False 15896 1727203886.89634: attempt loop complete, returning result 15896 1727203886.89637: _execute() done 15896 1727203886.89640: dumping result to json 15896 1727203886.89645: done dumping result, returning 15896 1727203886.89653: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the port2 profile "bond0.1" [028d2410-947f-fb83-b6ad-0000000000c1] 15896 1727203886.89655: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c1 15896 1727203886.89752: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c1 15896 1727203886.89755: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.017863", "end": "2024-09-24 14:51:26.849696", "rc": 0, "start": "2024-09-24 14:51:26.831833" } 15896 1727203886.89834: no more pending results, returning what we have 15896 1727203886.89838: results queue empty 15896 1727203886.89839: checking for any_errors_fatal 15896 1727203886.89850: done checking for any_errors_fatal 15896 1727203886.89850: checking for max_fail_percentage 15896 1727203886.89854: done checking for max_fail_percentage 15896 1727203886.89854: checking to see if all hosts have failed and the running result is not ok 15896 1727203886.89855: done checking to see if all hosts have failed 15896 1727203886.89856: getting the remaining hosts for this loop 15896 1727203886.89858: done getting the remaining hosts for this loop 15896 1727203886.89861: getting the next task for host managed-node1 15896 1727203886.89868: done getting next task for host managed-node1 15896 1727203886.89871: ^ task is: TASK: Assert that the port1 profile is not activated 15896 1727203886.89873: ^ state is: HOST STATE: block=2, task=21, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203886.89878: getting variables 15896 1727203886.89879: in VariableManager get_vars() 15896 1727203886.89935: Calling all_inventory to load vars for managed-node1 15896 1727203886.89937: Calling groups_inventory to load vars for managed-node1 15896 1727203886.89939: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203886.89950: Calling all_plugins_play to load vars for managed-node1 15896 1727203886.89953: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203886.89955: Calling groups_plugins_play to load vars for managed-node1 15896 1727203886.90744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203886.91600: done with get_vars() 15896 1727203886.91615: done getting variables 15896 1727203886.91660: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Tuesday 24 September 2024 14:51:26 -0400 (0:00:00.393) 0:00:32.506 ***** 15896 1727203886.91684: entering _queue_task() for managed-node1/assert 15896 1727203886.91920: worker is 1 (out of 1 available) 15896 1727203886.91933: exiting _queue_task() for managed-node1/assert 15896 1727203886.91946: done queuing things up, now waiting for results queue to drain 15896 1727203886.91947: waiting for 
pending results... 15896 1727203886.92133: running TaskExecutor() for managed-node1/TASK: Assert that the port1 profile is not activated 15896 1727203886.92200: in run() - task 028d2410-947f-fb83-b6ad-0000000000c2 15896 1727203886.92211: variable 'ansible_search_path' from source: unknown 15896 1727203886.92239: calling self._execute() 15896 1727203886.92325: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.92329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.92340: variable 'omit' from source: magic vars 15896 1727203886.92618: variable 'ansible_distribution_major_version' from source: facts 15896 1727203886.92627: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203886.92706: variable 'network_provider' from source: set_fact 15896 1727203886.92711: Evaluated conditional (network_provider == "nm"): True 15896 1727203886.92724: variable 'omit' from source: magic vars 15896 1727203886.92736: variable 'omit' from source: magic vars 15896 1727203886.92806: variable 'port1_profile' from source: play vars 15896 1727203886.92823: variable 'omit' from source: magic vars 15896 1727203886.92856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203886.92885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203886.92902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203886.92915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.92925: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.92952: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203886.92955: 
variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.92958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.93029: Set connection var ansible_shell_type to sh 15896 1727203886.93034: Set connection var ansible_connection to ssh 15896 1727203886.93040: Set connection var ansible_shell_executable to /bin/sh 15896 1727203886.93044: Set connection var ansible_pipelining to False 15896 1727203886.93055: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203886.93057: Set connection var ansible_timeout to 10 15896 1727203886.93077: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.93080: variable 'ansible_connection' from source: unknown 15896 1727203886.93082: variable 'ansible_module_compression' from source: unknown 15896 1727203886.93084: variable 'ansible_shell_type' from source: unknown 15896 1727203886.93086: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.93089: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.93094: variable 'ansible_pipelining' from source: unknown 15896 1727203886.93096: variable 'ansible_timeout' from source: unknown 15896 1727203886.93100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.93203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203886.93212: variable 'omit' from source: magic vars 15896 1727203886.93217: starting attempt loop 15896 1727203886.93221: running the handler 15896 1727203886.93334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203886.94791: Loading FilterModule 
'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203886.94838: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203886.94865: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203886.94982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203886.94985: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203886.94987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203886.94990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203886.95002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203886.95029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203886.95040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203886.95120: variable 'active_port1_profile' from source: set_fact 15896 1727203886.95136: Evaluated conditional (active_port1_profile.stdout | length == 0): True 15896 1727203886.95141: handler run complete 15896 1727203886.95152: 
attempt loop complete, returning result 15896 1727203886.95155: _execute() done 15896 1727203886.95158: dumping result to json 15896 1727203886.95160: done dumping result, returning 15896 1727203886.95170: done running TaskExecutor() for managed-node1/TASK: Assert that the port1 profile is not activated [028d2410-947f-fb83-b6ad-0000000000c2] 15896 1727203886.95173: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c2 15896 1727203886.95255: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c2 15896 1727203886.95257: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203886.95306: no more pending results, returning what we have 15896 1727203886.95310: results queue empty 15896 1727203886.95310: checking for any_errors_fatal 15896 1727203886.95321: done checking for any_errors_fatal 15896 1727203886.95322: checking for max_fail_percentage 15896 1727203886.95323: done checking for max_fail_percentage 15896 1727203886.95324: checking to see if all hosts have failed and the running result is not ok 15896 1727203886.95325: done checking to see if all hosts have failed 15896 1727203886.95326: getting the remaining hosts for this loop 15896 1727203886.95327: done getting the remaining hosts for this loop 15896 1727203886.95330: getting the next task for host managed-node1 15896 1727203886.95335: done getting next task for host managed-node1 15896 1727203886.95337: ^ task is: TASK: Assert that the port2 profile is not activated 15896 1727203886.95339: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203886.95343: getting variables 15896 1727203886.95350: in VariableManager get_vars() 15896 1727203886.95409: Calling all_inventory to load vars for managed-node1 15896 1727203886.95412: Calling groups_inventory to load vars for managed-node1 15896 1727203886.95415: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203886.95425: Calling all_plugins_play to load vars for managed-node1 15896 1727203886.95427: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203886.95429: Calling groups_plugins_play to load vars for managed-node1 15896 1727203886.96357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203886.97203: done with get_vars() 15896 1727203886.97219: done getting variables 15896 1727203886.97261: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Tuesday 24 September 2024 14:51:26 -0400 (0:00:00.055) 0:00:32.562 ***** 15896 1727203886.97283: entering _queue_task() for managed-node1/assert 15896 1727203886.97522: worker is 1 (out of 1 available) 15896 1727203886.97534: exiting _queue_task() for managed-node1/assert 15896 1727203886.97546: done queuing things up, now waiting for results queue to drain 15896 1727203886.97547: waiting for pending results... 
15896 1727203886.97738: running TaskExecutor() for managed-node1/TASK: Assert that the port2 profile is not activated 15896 1727203886.97810: in run() - task 028d2410-947f-fb83-b6ad-0000000000c3 15896 1727203886.97823: variable 'ansible_search_path' from source: unknown 15896 1727203886.97851: calling self._execute() 15896 1727203886.97943: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.97947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.97956: variable 'omit' from source: magic vars 15896 1727203886.98239: variable 'ansible_distribution_major_version' from source: facts 15896 1727203886.98248: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203886.98330: variable 'network_provider' from source: set_fact 15896 1727203886.98336: Evaluated conditional (network_provider == "nm"): True 15896 1727203886.98342: variable 'omit' from source: magic vars 15896 1727203886.98358: variable 'omit' from source: magic vars 15896 1727203886.98430: variable 'port2_profile' from source: play vars 15896 1727203886.98442: variable 'omit' from source: magic vars 15896 1727203886.98476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203886.98502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203886.98518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203886.98536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.98545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203886.98566: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203886.98569: variable 
'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.98571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.98640: Set connection var ansible_shell_type to sh 15896 1727203886.98643: Set connection var ansible_connection to ssh 15896 1727203886.98655: Set connection var ansible_shell_executable to /bin/sh 15896 1727203886.98658: Set connection var ansible_pipelining to False 15896 1727203886.98663: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203886.98665: Set connection var ansible_timeout to 10 15896 1727203886.98684: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.98686: variable 'ansible_connection' from source: unknown 15896 1727203886.98689: variable 'ansible_module_compression' from source: unknown 15896 1727203886.98691: variable 'ansible_shell_type' from source: unknown 15896 1727203886.98693: variable 'ansible_shell_executable' from source: unknown 15896 1727203886.98695: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203886.98700: variable 'ansible_pipelining' from source: unknown 15896 1727203886.98702: variable 'ansible_timeout' from source: unknown 15896 1727203886.98706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203886.98807: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203886.98815: variable 'omit' from source: magic vars 15896 1727203886.98821: starting attempt loop 15896 1727203886.98824: running the handler 15896 1727203886.98932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203887.00367: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203887.00416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203887.00444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203887.00469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203887.00490: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203887.00541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203887.00563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203887.00581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203887.00608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203887.00620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203887.00693: variable 'active_port2_profile' from source: set_fact 15896 1727203887.00708: Evaluated conditional (active_port2_profile.stdout | length == 0): True 15896 1727203887.00714: handler run complete 15896 1727203887.00725: attempt loop 
complete, returning result 15896 1727203887.00728: _execute() done 15896 1727203887.00732: dumping result to json 15896 1727203887.00734: done dumping result, returning 15896 1727203887.00744: done running TaskExecutor() for managed-node1/TASK: Assert that the port2 profile is not activated [028d2410-947f-fb83-b6ad-0000000000c3] 15896 1727203887.00747: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c3 15896 1727203887.00828: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c3 15896 1727203887.00831: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203887.00895: no more pending results, returning what we have 15896 1727203887.00898: results queue empty 15896 1727203887.00899: checking for any_errors_fatal 15896 1727203887.00907: done checking for any_errors_fatal 15896 1727203887.00908: checking for max_fail_percentage 15896 1727203887.00910: done checking for max_fail_percentage 15896 1727203887.00910: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.00911: done checking to see if all hosts have failed 15896 1727203887.00912: getting the remaining hosts for this loop 15896 1727203887.00913: done getting the remaining hosts for this loop 15896 1727203887.00916: getting the next task for host managed-node1 15896 1727203887.00921: done getting next task for host managed-node1 15896 1727203887.00924: ^ task is: TASK: Get the port1 device state 15896 1727203887.00925: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.00928: getting variables 15896 1727203887.00936: in VariableManager get_vars() 15896 1727203887.00989: Calling all_inventory to load vars for managed-node1 15896 1727203887.00992: Calling groups_inventory to load vars for managed-node1 15896 1727203887.00994: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.01003: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.01006: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.01008: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.01791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.02659: done with get_vars() 15896 1727203887.02677: done getting variables 15896 1727203887.02720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.054) 0:00:32.616 ***** 15896 1727203887.02740: entering _queue_task() for managed-node1/command 15896 1727203887.02978: worker is 1 (out of 1 available) 15896 1727203887.02992: exiting _queue_task() for managed-node1/command 15896 1727203887.03003: done queuing things up, now waiting for results queue to drain 15896 1727203887.03005: waiting for pending results... 
15896 1727203887.03178: running TaskExecutor() for managed-node1/TASK: Get the port1 device state 15896 1727203887.03242: in run() - task 028d2410-947f-fb83-b6ad-0000000000c4 15896 1727203887.03256: variable 'ansible_search_path' from source: unknown 15896 1727203887.03290: calling self._execute() 15896 1727203887.03372: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.03377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.03391: variable 'omit' from source: magic vars 15896 1727203887.03670: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.03684: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.03758: variable 'network_provider' from source: set_fact 15896 1727203887.03765: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203887.03769: when evaluation is False, skipping this task 15896 1727203887.03774: _execute() done 15896 1727203887.03778: dumping result to json 15896 1727203887.03781: done dumping result, returning 15896 1727203887.03785: done running TaskExecutor() for managed-node1/TASK: Get the port1 device state [028d2410-947f-fb83-b6ad-0000000000c4] 15896 1727203887.03795: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c4 15896 1727203887.03873: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c4 15896 1727203887.03878: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203887.03943: no more pending results, returning what we have 15896 1727203887.03946: results queue empty 15896 1727203887.03947: checking for any_errors_fatal 15896 1727203887.03953: done checking for any_errors_fatal 15896 1727203887.03954: checking for max_fail_percentage 15896 1727203887.03955: done checking for max_fail_percentage 15896 
1727203887.03956: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.03957: done checking to see if all hosts have failed 15896 1727203887.03958: getting the remaining hosts for this loop 15896 1727203887.03959: done getting the remaining hosts for this loop 15896 1727203887.03962: getting the next task for host managed-node1 15896 1727203887.03967: done getting next task for host managed-node1 15896 1727203887.03969: ^ task is: TASK: Get the port2 device state 15896 1727203887.03971: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203887.03974: getting variables 15896 1727203887.03977: in VariableManager get_vars() 15896 1727203887.04020: Calling all_inventory to load vars for managed-node1 15896 1727203887.04023: Calling groups_inventory to load vars for managed-node1 15896 1727203887.04025: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.04033: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.04036: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.04038: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.04933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.05801: done with get_vars() 15896 1727203887.05816: done getting variables 15896 1727203887.05859: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the port2 device 
state] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.031) 0:00:32.648 ***** 15896 1727203887.05880: entering _queue_task() for managed-node1/command 15896 1727203887.06109: worker is 1 (out of 1 available) 15896 1727203887.06123: exiting _queue_task() for managed-node1/command 15896 1727203887.06135: done queuing things up, now waiting for results queue to drain 15896 1727203887.06136: waiting for pending results... 15896 1727203887.06316: running TaskExecutor() for managed-node1/TASK: Get the port2 device state 15896 1727203887.06387: in run() - task 028d2410-947f-fb83-b6ad-0000000000c5 15896 1727203887.06399: variable 'ansible_search_path' from source: unknown 15896 1727203887.06429: calling self._execute() 15896 1727203887.06514: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.06518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.06527: variable 'omit' from source: magic vars 15896 1727203887.06806: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.06814: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.06892: variable 'network_provider' from source: set_fact 15896 1727203887.06895: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203887.06900: when evaluation is False, skipping this task 15896 1727203887.06904: _execute() done 15896 1727203887.06908: dumping result to json 15896 1727203887.06911: done dumping result, returning 15896 1727203887.06921: done running TaskExecutor() for managed-node1/TASK: Get the port2 device state [028d2410-947f-fb83-b6ad-0000000000c5] 15896 1727203887.06923: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c5 15896 1727203887.07000: done sending task result for 
task 028d2410-947f-fb83-b6ad-0000000000c5 15896 1727203887.07002: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203887.07076: no more pending results, returning what we have 15896 1727203887.07080: results queue empty 15896 1727203887.07081: checking for any_errors_fatal 15896 1727203887.07089: done checking for any_errors_fatal 15896 1727203887.07090: checking for max_fail_percentage 15896 1727203887.07092: done checking for max_fail_percentage 15896 1727203887.07093: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.07094: done checking to see if all hosts have failed 15896 1727203887.07094: getting the remaining hosts for this loop 15896 1727203887.07096: done getting the remaining hosts for this loop 15896 1727203887.07099: getting the next task for host managed-node1 15896 1727203887.07104: done getting next task for host managed-node1 15896 1727203887.07106: ^ task is: TASK: Assert that the port1 device is in DOWN state 15896 1727203887.07108: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.07111: getting variables 15896 1727203887.07113: in VariableManager get_vars() 15896 1727203887.07156: Calling all_inventory to load vars for managed-node1 15896 1727203887.07158: Calling groups_inventory to load vars for managed-node1 15896 1727203887.07160: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.07169: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.07172: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.07174: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.07958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.08812: done with get_vars() 15896 1727203887.08826: done getting variables 15896 1727203887.08868: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.030) 0:00:32.678 ***** 15896 1727203887.08890: entering _queue_task() for managed-node1/assert 15896 1727203887.09112: worker is 1 (out of 1 available) 15896 1727203887.09127: exiting _queue_task() for managed-node1/assert 15896 1727203887.09140: done queuing things up, now waiting for results queue to drain 15896 1727203887.09141: waiting for pending results... 
15896 1727203887.09321: running TaskExecutor() for managed-node1/TASK: Assert that the port1 device is in DOWN state 15896 1727203887.09403: in run() - task 028d2410-947f-fb83-b6ad-0000000000c6 15896 1727203887.09415: variable 'ansible_search_path' from source: unknown 15896 1727203887.09442: calling self._execute() 15896 1727203887.09530: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.09533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.09542: variable 'omit' from source: magic vars 15896 1727203887.09843: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.09852: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.09931: variable 'network_provider' from source: set_fact 15896 1727203887.09937: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203887.09939: when evaluation is False, skipping this task 15896 1727203887.09942: _execute() done 15896 1727203887.09944: dumping result to json 15896 1727203887.09947: done dumping result, returning 15896 1727203887.09957: done running TaskExecutor() for managed-node1/TASK: Assert that the port1 device is in DOWN state [028d2410-947f-fb83-b6ad-0000000000c6] 15896 1727203887.09962: sending task result for task 028d2410-947f-fb83-b6ad-0000000000c6 15896 1727203887.10044: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c6 15896 1727203887.10046: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203887.10103: no more pending results, returning what we have 15896 1727203887.10107: results queue empty 15896 1727203887.10108: checking for any_errors_fatal 15896 1727203887.10112: done checking for any_errors_fatal 15896 1727203887.10113: checking for max_fail_percentage 15896 1727203887.10115: done 
checking for max_fail_percentage 15896 1727203887.10115: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.10116: done checking to see if all hosts have failed 15896 1727203887.10117: getting the remaining hosts for this loop 15896 1727203887.10118: done getting the remaining hosts for this loop 15896 1727203887.10121: getting the next task for host managed-node1 15896 1727203887.10126: done getting next task for host managed-node1 15896 1727203887.10128: ^ task is: TASK: Assert that the port2 device is in DOWN state 15896 1727203887.10131: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203887.10134: getting variables 15896 1727203887.10135: in VariableManager get_vars() 15896 1727203887.10183: Calling all_inventory to load vars for managed-node1 15896 1727203887.10185: Calling groups_inventory to load vars for managed-node1 15896 1727203887.10187: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.10196: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.10198: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.10201: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.11055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.12335: done with get_vars() 15896 1727203887.12362: done getting variables 15896 1727203887.12426: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.035) 0:00:32.713 ***** 15896 1727203887.12455: entering _queue_task() for managed-node1/assert 15896 1727203887.12743: worker is 1 (out of 1 available) 15896 1727203887.12757: exiting _queue_task() for managed-node1/assert 15896 1727203887.12769: done queuing things up, now waiting for results queue to drain 15896 1727203887.12770: waiting for pending results... 15896 1727203887.12956: running TaskExecutor() for managed-node1/TASK: Assert that the port2 device is in DOWN state 15896 1727203887.13028: in run() - task 028d2410-947f-fb83-b6ad-0000000000c7 15896 1727203887.13041: variable 'ansible_search_path' from source: unknown 15896 1727203887.13071: calling self._execute() 15896 1727203887.13158: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.13161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.13173: variable 'omit' from source: magic vars 15896 1727203887.13450: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.13461: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.13542: variable 'network_provider' from source: set_fact 15896 1727203887.13548: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203887.13551: when evaluation is False, skipping this task 15896 1727203887.13553: _execute() done 15896 1727203887.13555: dumping result to json 15896 1727203887.13557: done dumping result, returning 15896 1727203887.13566: done running TaskExecutor() for managed-node1/TASK: Assert that the port2 device is in DOWN state [028d2410-947f-fb83-b6ad-0000000000c7] 15896 1727203887.13572: sending task 
result for task 028d2410-947f-fb83-b6ad-0000000000c7 15896 1727203887.13656: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000c7 15896 1727203887.13659: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203887.13712: no more pending results, returning what we have 15896 1727203887.13716: results queue empty 15896 1727203887.13717: checking for any_errors_fatal 15896 1727203887.13725: done checking for any_errors_fatal 15896 1727203887.13726: checking for max_fail_percentage 15896 1727203887.13728: done checking for max_fail_percentage 15896 1727203887.13729: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.13729: done checking to see if all hosts have failed 15896 1727203887.13730: getting the remaining hosts for this loop 15896 1727203887.13732: done getting the remaining hosts for this loop 15896 1727203887.13735: getting the next task for host managed-node1 15896 1727203887.13742: done getting next task for host managed-node1 15896 1727203887.13748: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203887.13750: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.13770: getting variables 15896 1727203887.13772: in VariableManager get_vars() 15896 1727203887.13823: Calling all_inventory to load vars for managed-node1 15896 1727203887.13825: Calling groups_inventory to load vars for managed-node1 15896 1727203887.13827: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.13836: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.13838: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.13840: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.14718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.16301: done with get_vars() 15896 1727203887.16322: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.039) 0:00:32.753 ***** 15896 1727203887.16417: entering _queue_task() for managed-node1/include_tasks 15896 1727203887.16716: worker is 1 (out of 1 available) 15896 1727203887.16727: exiting _queue_task() for managed-node1/include_tasks 15896 1727203887.16739: done queuing things up, now waiting for results queue to drain 15896 1727203887.16741: waiting for pending results... 
15896 1727203887.17102: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203887.17188: in run() - task 028d2410-947f-fb83-b6ad-0000000000cf 15896 1727203887.17212: variable 'ansible_search_path' from source: unknown 15896 1727203887.17219: variable 'ansible_search_path' from source: unknown 15896 1727203887.17257: calling self._execute() 15896 1727203887.17365: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.17378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.17392: variable 'omit' from source: magic vars 15896 1727203887.17768: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.17786: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.17983: _execute() done 15896 1727203887.17987: dumping result to json 15896 1727203887.17989: done dumping result, returning 15896 1727203887.17992: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-fb83-b6ad-0000000000cf] 15896 1727203887.17994: sending task result for task 028d2410-947f-fb83-b6ad-0000000000cf 15896 1727203887.18068: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000cf 15896 1727203887.18071: WORKER PROCESS EXITING 15896 1727203887.18119: no more pending results, returning what we have 15896 1727203887.18123: in VariableManager get_vars() 15896 1727203887.18185: Calling all_inventory to load vars for managed-node1 15896 1727203887.18188: Calling groups_inventory to load vars for managed-node1 15896 1727203887.18190: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.18202: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.18205: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.18208: Calling 
groups_plugins_play to load vars for managed-node1 15896 1727203887.19595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.21087: done with get_vars() 15896 1727203887.21112: variable 'ansible_search_path' from source: unknown 15896 1727203887.21114: variable 'ansible_search_path' from source: unknown 15896 1727203887.21155: we have included files to process 15896 1727203887.21156: generating all_blocks data 15896 1727203887.21159: done generating all_blocks data 15896 1727203887.21166: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203887.21168: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203887.21170: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203887.21721: done processing included file 15896 1727203887.21724: iterating over new_blocks loaded from include file 15896 1727203887.21725: in VariableManager get_vars() 15896 1727203887.21758: done with get_vars() 15896 1727203887.21760: filtering new block on tags 15896 1727203887.21779: done filtering new block on tags 15896 1727203887.21782: in VariableManager get_vars() 15896 1727203887.21812: done with get_vars() 15896 1727203887.21813: filtering new block on tags 15896 1727203887.21832: done filtering new block on tags 15896 1727203887.21834: in VariableManager get_vars() 15896 1727203887.21863: done with get_vars() 15896 1727203887.21865: filtering new block on tags 15896 1727203887.21885: done filtering new block on tags 15896 1727203887.21887: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 15896 1727203887.21892: extending task lists for 
all hosts with included blocks 15896 1727203887.22664: done extending task lists 15896 1727203887.22666: done processing included files 15896 1727203887.22667: results queue empty 15896 1727203887.22667: checking for any_errors_fatal 15896 1727203887.22670: done checking for any_errors_fatal 15896 1727203887.22671: checking for max_fail_percentage 15896 1727203887.22672: done checking for max_fail_percentage 15896 1727203887.22673: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.22673: done checking to see if all hosts have failed 15896 1727203887.22674: getting the remaining hosts for this loop 15896 1727203887.22677: done getting the remaining hosts for this loop 15896 1727203887.22679: getting the next task for host managed-node1 15896 1727203887.22683: done getting next task for host managed-node1 15896 1727203887.22686: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203887.22689: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.22700: getting variables 15896 1727203887.22701: in VariableManager get_vars() 15896 1727203887.22721: Calling all_inventory to load vars for managed-node1 15896 1727203887.22723: Calling groups_inventory to load vars for managed-node1 15896 1727203887.22725: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.22730: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.22733: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.22735: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.24011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.25509: done with get_vars() 15896 1727203887.25540: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.092) 0:00:32.845 ***** 15896 1727203887.25631: entering _queue_task() for managed-node1/setup 15896 1727203887.26055: worker is 1 (out of 1 available) 15896 1727203887.26067: exiting _queue_task() for managed-node1/setup 15896 1727203887.26489: done queuing things up, now waiting for results queue to drain 15896 1727203887.26492: waiting for pending results... 
15896 1727203887.26839: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203887.27014: in run() - task 028d2410-947f-fb83-b6ad-000000000796 15896 1727203887.27114: variable 'ansible_search_path' from source: unknown 15896 1727203887.27157: variable 'ansible_search_path' from source: unknown 15896 1727203887.27200: calling self._execute() 15896 1727203887.27348: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.27384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.27400: variable 'omit' from source: magic vars 15896 1727203887.27796: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.27881: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.28022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203887.30328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203887.30417: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203887.30458: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203887.30498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203887.30533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203887.30615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203887.30654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203887.30684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203887.30730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203887.30781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203887.30811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203887.30838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203887.30865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203887.30980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203887.30984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203887.31097: variable '__network_required_facts' from source: role 
'' defaults 15896 1727203887.31116: variable 'ansible_facts' from source: unknown 15896 1727203887.32479: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15896 1727203887.32517: when evaluation is False, skipping this task 15896 1727203887.32684: _execute() done 15896 1727203887.32687: dumping result to json 15896 1727203887.32690: done dumping result, returning 15896 1727203887.32693: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-fb83-b6ad-000000000796] 15896 1727203887.32695: sending task result for task 028d2410-947f-fb83-b6ad-000000000796 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203887.32928: no more pending results, returning what we have 15896 1727203887.32934: results queue empty 15896 1727203887.32935: checking for any_errors_fatal 15896 1727203887.32936: done checking for any_errors_fatal 15896 1727203887.32937: checking for max_fail_percentage 15896 1727203887.32939: done checking for max_fail_percentage 15896 1727203887.32940: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.32941: done checking to see if all hosts have failed 15896 1727203887.32942: getting the remaining hosts for this loop 15896 1727203887.32943: done getting the remaining hosts for this loop 15896 1727203887.32947: getting the next task for host managed-node1 15896 1727203887.32958: done getting next task for host managed-node1 15896 1727203887.32962: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203887.32966: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203887.32991: getting variables 15896 1727203887.32994: in VariableManager get_vars() 15896 1727203887.33062: Calling all_inventory to load vars for managed-node1 15896 1727203887.33065: Calling groups_inventory to load vars for managed-node1 15896 1727203887.33068: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.33286: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.33290: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.33295: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.34183: done sending task result for task 028d2410-947f-fb83-b6ad-000000000796 15896 1727203887.34187: WORKER PROCESS EXITING 15896 1727203887.35946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.39119: done with get_vars() 15896 1727203887.39149: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.138) 0:00:32.983 ***** 15896 1727203887.39461: entering _queue_task() for managed-node1/stat 15896 1727203887.40232: worker is 
1 (out of 1 available) 15896 1727203887.40246: exiting _queue_task() for managed-node1/stat 15896 1727203887.40261: done queuing things up, now waiting for results queue to drain 15896 1727203887.40263: waiting for pending results... 15896 1727203887.40760: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203887.41100: in run() - task 028d2410-947f-fb83-b6ad-000000000798 15896 1727203887.41114: variable 'ansible_search_path' from source: unknown 15896 1727203887.41117: variable 'ansible_search_path' from source: unknown 15896 1727203887.41270: calling self._execute() 15896 1727203887.41507: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.41512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.41523: variable 'omit' from source: magic vars 15896 1727203887.42886: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.42898: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.43058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203887.44081: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203887.44085: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203887.44087: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203887.44090: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203887.44328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203887.44358: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203887.44395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203887.44425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203887.44883: variable '__network_is_ostree' from source: set_fact 15896 1727203887.44886: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203887.44889: when evaluation is False, skipping this task 15896 1727203887.44891: _execute() done 15896 1727203887.44893: dumping result to json 15896 1727203887.44896: done dumping result, returning 15896 1727203887.44899: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-fb83-b6ad-000000000798] 15896 1727203887.44901: sending task result for task 028d2410-947f-fb83-b6ad-000000000798 15896 1727203887.44974: done sending task result for task 028d2410-947f-fb83-b6ad-000000000798 15896 1727203887.44980: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203887.45042: no more pending results, returning what we have 15896 1727203887.45047: results queue empty 15896 1727203887.45047: checking for any_errors_fatal 15896 1727203887.45055: done checking for any_errors_fatal 15896 1727203887.45055: checking for max_fail_percentage 15896 1727203887.45058: done checking for max_fail_percentage 15896 1727203887.45059: checking to see if all hosts have failed and the running result is not ok 15896 
1727203887.45059: done checking to see if all hosts have failed 15896 1727203887.45060: getting the remaining hosts for this loop 15896 1727203887.45062: done getting the remaining hosts for this loop 15896 1727203887.45065: getting the next task for host managed-node1 15896 1727203887.45072: done getting next task for host managed-node1 15896 1727203887.45078: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203887.45082: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.45102: getting variables 15896 1727203887.45104: in VariableManager get_vars() 15896 1727203887.45157: Calling all_inventory to load vars for managed-node1 15896 1727203887.45159: Calling groups_inventory to load vars for managed-node1 15896 1727203887.45161: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.45172: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.45174: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.45382: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.47950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.49917: done with get_vars() 15896 1727203887.49946: done getting variables 15896 1727203887.50017: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.105) 0:00:33.089 ***** 15896 1727203887.50057: entering _queue_task() for managed-node1/set_fact 15896 1727203887.50447: worker is 1 (out of 1 available) 15896 1727203887.50459: exiting _queue_task() for managed-node1/set_fact 15896 1727203887.50471: done queuing things up, now waiting for results queue to drain 15896 1727203887.50473: waiting for pending results... 
15896 1727203887.50799: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203887.51138: in run() - task 028d2410-947f-fb83-b6ad-000000000799 15896 1727203887.51159: variable 'ansible_search_path' from source: unknown 15896 1727203887.51169: variable 'ansible_search_path' from source: unknown 15896 1727203887.51215: calling self._execute() 15896 1727203887.51502: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.51517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.51531: variable 'omit' from source: magic vars 15896 1727203887.52310: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.52330: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.52549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203887.52847: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203887.52905: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203887.52943: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203887.52990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203887.53080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203887.53114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203887.53143: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203887.53176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203887.53274: variable '__network_is_ostree' from source: set_fact 15896 1727203887.53289: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203887.53296: when evaluation is False, skipping this task 15896 1727203887.53306: _execute() done 15896 1727203887.53314: dumping result to json 15896 1727203887.53321: done dumping result, returning 15896 1727203887.53333: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-000000000799] 15896 1727203887.53342: sending task result for task 028d2410-947f-fb83-b6ad-000000000799 15896 1727203887.53558: done sending task result for task 028d2410-947f-fb83-b6ad-000000000799 15896 1727203887.53566: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203887.53614: no more pending results, returning what we have 15896 1727203887.53618: results queue empty 15896 1727203887.53619: checking for any_errors_fatal 15896 1727203887.53624: done checking for any_errors_fatal 15896 1727203887.53625: checking for max_fail_percentage 15896 1727203887.53627: done checking for max_fail_percentage 15896 1727203887.53628: checking to see if all hosts have failed and the running result is not ok 15896 1727203887.53629: done checking to see if all hosts have failed 15896 1727203887.53629: getting the remaining hosts for this loop 15896 1727203887.53631: done getting the remaining hosts for this loop 
15896 1727203887.53634: getting the next task for host managed-node1 15896 1727203887.53643: done getting next task for host managed-node1 15896 1727203887.53646: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203887.53650: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203887.53671: getting variables 15896 1727203887.53673: in VariableManager get_vars() 15896 1727203887.53722: Calling all_inventory to load vars for managed-node1 15896 1727203887.53724: Calling groups_inventory to load vars for managed-node1 15896 1727203887.53726: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203887.53735: Calling all_plugins_play to load vars for managed-node1 15896 1727203887.53738: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203887.53740: Calling groups_plugins_play to load vars for managed-node1 15896 1727203887.55440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203887.57388: done with get_vars() 15896 1727203887.57409: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:27 -0400 (0:00:00.074) 0:00:33.164 ***** 15896 1727203887.57512: entering _queue_task() for managed-node1/service_facts 15896 1727203887.58007: worker is 1 (out of 1 available) 15896 1727203887.58019: exiting _queue_task() for managed-node1/service_facts 15896 1727203887.58030: done queuing things up, now waiting for results queue to drain 15896 1727203887.58031: waiting for pending results... 
15896 1727203887.58582: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203887.58883: in run() - task 028d2410-947f-fb83-b6ad-00000000079b 15896 1727203887.58888: variable 'ansible_search_path' from source: unknown 15896 1727203887.58894: variable 'ansible_search_path' from source: unknown 15896 1727203887.58981: calling self._execute() 15896 1727203887.59047: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.59058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.59073: variable 'omit' from source: magic vars 15896 1727203887.60101: variable 'ansible_distribution_major_version' from source: facts 15896 1727203887.60181: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203887.60184: variable 'omit' from source: magic vars 15896 1727203887.60499: variable 'omit' from source: magic vars 15896 1727203887.60503: variable 'omit' from source: magic vars 15896 1727203887.60556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203887.60628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203887.60654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203887.60700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203887.60727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203887.60765: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203887.60809: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.60812: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203887.60898: Set connection var ansible_shell_type to sh 15896 1727203887.60915: Set connection var ansible_connection to ssh 15896 1727203887.60926: Set connection var ansible_shell_executable to /bin/sh 15896 1727203887.60983: Set connection var ansible_pipelining to False 15896 1727203887.60986: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203887.60989: Set connection var ansible_timeout to 10 15896 1727203887.60991: variable 'ansible_shell_executable' from source: unknown 15896 1727203887.60993: variable 'ansible_connection' from source: unknown 15896 1727203887.61001: variable 'ansible_module_compression' from source: unknown 15896 1727203887.61005: variable 'ansible_shell_type' from source: unknown 15896 1727203887.61011: variable 'ansible_shell_executable' from source: unknown 15896 1727203887.61017: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203887.61031: variable 'ansible_pipelining' from source: unknown 15896 1727203887.61037: variable 'ansible_timeout' from source: unknown 15896 1727203887.61045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203887.61271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203887.61290: variable 'omit' from source: magic vars 15896 1727203887.61309: starting attempt loop 15896 1727203887.61312: running the handler 15896 1727203887.61353: _low_level_execute_command(): starting 15896 1727203887.61356: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203887.62092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203887.62182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15896 1727203887.62210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.62297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203887.62317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203887.62353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203887.62462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203887.64280: stdout chunk (state=3): >>>/root <<< 15896 1727203887.64425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203887.64445: stderr chunk (state=3): >>><<< 15896 1727203887.64455: stdout chunk (state=3): >>><<< 15896 1727203887.64583: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203887.64588: _low_level_execute_command(): starting 15896 1727203887.64591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621 `" && echo ansible-tmp-1727203887.6448865-18344-4171655531621="` echo /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621 `" ) && sleep 0' 15896 1727203887.65110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.65114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203887.65199: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.65229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203887.65250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203887.65301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203887.65409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203887.67535: stdout chunk (state=3): >>>ansible-tmp-1727203887.6448865-18344-4171655531621=/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621 <<< 15896 1727203887.67788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203887.67792: stdout chunk (state=3): >>><<< 15896 1727203887.67794: stderr chunk (state=3): >>><<< 15896 1727203887.67797: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203887.6448865-18344-4171655531621=/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203887.67800: variable 'ansible_module_compression' from source: unknown 15896 1727203887.67802: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15896 1727203887.67838: variable 'ansible_facts' from source: unknown 15896 1727203887.67932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py 15896 1727203887.68145: Sending initial data 15896 1727203887.68148: Sent initial data (160 bytes) 15896 1727203887.68567: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203887.68574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203887.68614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203887.68618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203887.68620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.68622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 
1727203887.68626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203887.68628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.68669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203887.68689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203887.68769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203887.70539: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203887.70609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203887.70703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmptfj3o7vt /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py <<< 15896 1727203887.70718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py" <<< 15896 1727203887.70788: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmptfj3o7vt" to remote "/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py" <<< 15896 1727203887.70791: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py" <<< 15896 1727203887.71703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203887.71757: stderr chunk (state=3): >>><<< 15896 1727203887.71762: stdout chunk (state=3): >>><<< 15896 1727203887.71784: done transferring module to remote 15896 1727203887.71798: _low_level_execute_command(): starting 15896 1727203887.71819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/ /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py && sleep 0' 15896 1727203887.72268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203887.72271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.72273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203887.72281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.72328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203887.72331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203887.72335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203887.72412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203887.74370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203887.74395: stderr chunk (state=3): >>><<< 15896 1727203887.74398: stdout chunk (state=3): >>><<< 15896 1727203887.74411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203887.74414: _low_level_execute_command(): starting 15896 1727203887.74419: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/AnsiballZ_service_facts.py && sleep 0' 15896 1727203887.74833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203887.74860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203887.74864: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203887.74866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.74868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203887.74870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203887.74928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203887.74935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203887.74937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203887.75023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.52954: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15896 1727203889.52973: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": 
{"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 15896 1727203889.52990: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15896 1727203889.53021: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 15896 1727203889.53026: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15896 1727203889.54850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203889.54883: stderr chunk (state=3): >>><<< 15896 1727203889.54887: stdout chunk (state=3): >>><<< 15896 1727203889.54909: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": 
{"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": 
"systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203889.55342: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203889.55350: _low_level_execute_command(): starting 15896 1727203889.55355: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203887.6448865-18344-4171655531621/ > /dev/null 2>&1 && sleep 0' 15896 1727203889.55821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203889.55825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203889.55828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.55830: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.55832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.55888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203889.55891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203889.55893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.55980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.57980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203889.58008: stderr chunk (state=3): >>><<< 15896 1727203889.58011: stdout chunk (state=3): >>><<< 15896 1727203889.58024: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203889.58029: handler run complete 15896 1727203889.58148: variable 'ansible_facts' from source: unknown 15896 1727203889.58245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203889.58523: variable 'ansible_facts' from source: unknown 15896 1727203889.58606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203889.58721: attempt loop complete, returning result 15896 1727203889.58724: _execute() done 15896 1727203889.58726: dumping result to json 15896 1727203889.58767: done dumping result, returning 15896 1727203889.58777: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-fb83-b6ad-00000000079b] 15896 1727203889.58780: sending task result for task 028d2410-947f-fb83-b6ad-00000000079b ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203889.59381: no more pending results, returning what we have 15896 1727203889.59388: results queue empty 15896 1727203889.59389: checking for any_errors_fatal 15896 1727203889.59394: done checking for any_errors_fatal 15896 1727203889.59395: checking for max_fail_percentage 15896 1727203889.59396: done checking for max_fail_percentage 15896 1727203889.59397: checking to see if all hosts have failed and the running result is not 
ok 15896 1727203889.59398: done checking to see if all hosts have failed 15896 1727203889.59398: getting the remaining hosts for this loop 15896 1727203889.59401: done getting the remaining hosts for this loop 15896 1727203889.59404: getting the next task for host managed-node1 15896 1727203889.59409: done getting next task for host managed-node1 15896 1727203889.59412: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203889.59416: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203889.59426: done sending task result for task 028d2410-947f-fb83-b6ad-00000000079b 15896 1727203889.59428: WORKER PROCESS EXITING 15896 1727203889.59434: getting variables 15896 1727203889.59435: in VariableManager get_vars() 15896 1727203889.59468: Calling all_inventory to load vars for managed-node1 15896 1727203889.59470: Calling groups_inventory to load vars for managed-node1 15896 1727203889.59471: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203889.59480: Calling all_plugins_play to load vars for managed-node1 15896 1727203889.59482: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203889.59484: Calling groups_plugins_play to load vars for managed-node1 15896 1727203889.60261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203889.64730: done with get_vars() 15896 1727203889.64750: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:51:29 -0400 (0:00:02.073) 0:00:35.237 ***** 15896 1727203889.64817: entering _queue_task() for managed-node1/package_facts 15896 1727203889.65090: worker is 1 (out of 1 available) 15896 1727203889.65103: exiting _queue_task() for managed-node1/package_facts 15896 1727203889.65114: done queuing things up, now waiting for results queue to drain 15896 1727203889.65116: waiting for pending results... 
15896 1727203889.65307: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203889.65414: in run() - task 028d2410-947f-fb83-b6ad-00000000079c 15896 1727203889.65426: variable 'ansible_search_path' from source: unknown 15896 1727203889.65430: variable 'ansible_search_path' from source: unknown 15896 1727203889.65459: calling self._execute() 15896 1727203889.65548: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203889.65558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203889.65571: variable 'omit' from source: magic vars 15896 1727203889.65854: variable 'ansible_distribution_major_version' from source: facts 15896 1727203889.65864: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203889.65872: variable 'omit' from source: magic vars 15896 1727203889.65923: variable 'omit' from source: magic vars 15896 1727203889.65946: variable 'omit' from source: magic vars 15896 1727203889.65982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203889.66011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203889.66026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203889.66040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203889.66050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203889.66078: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203889.66081: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203889.66084: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203889.66152: Set connection var ansible_shell_type to sh 15896 1727203889.66158: Set connection var ansible_connection to ssh 15896 1727203889.66166: Set connection var ansible_shell_executable to /bin/sh 15896 1727203889.66171: Set connection var ansible_pipelining to False 15896 1727203889.66178: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203889.66184: Set connection var ansible_timeout to 10 15896 1727203889.66201: variable 'ansible_shell_executable' from source: unknown 15896 1727203889.66204: variable 'ansible_connection' from source: unknown 15896 1727203889.66207: variable 'ansible_module_compression' from source: unknown 15896 1727203889.66210: variable 'ansible_shell_type' from source: unknown 15896 1727203889.66214: variable 'ansible_shell_executable' from source: unknown 15896 1727203889.66217: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203889.66219: variable 'ansible_pipelining' from source: unknown 15896 1727203889.66222: variable 'ansible_timeout' from source: unknown 15896 1727203889.66224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203889.66370: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203889.66381: variable 'omit' from source: magic vars 15896 1727203889.66385: starting attempt loop 15896 1727203889.66388: running the handler 15896 1727203889.66400: _low_level_execute_command(): starting 15896 1727203889.66407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203889.66927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15896 1727203889.66932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203889.66935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203889.66937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.66979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203889.67001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.67085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.68880: stdout chunk (state=3): >>>/root <<< 15896 1727203889.68981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203889.69008: stderr chunk (state=3): >>><<< 15896 1727203889.69011: stdout chunk (state=3): >>><<< 15896 1727203889.69034: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203889.69048: _low_level_execute_command(): starting 15896 1727203889.69054: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103 `" && echo ansible-tmp-1727203889.6903403-18438-274004965043103="` echo /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103 `" ) && sleep 0' 15896 1727203889.69502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.69505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.69508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203889.69517: stderr chunk 
(state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203889.69519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.69561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203889.69564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203889.69569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.69651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.71786: stdout chunk (state=3): >>>ansible-tmp-1727203889.6903403-18438-274004965043103=/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103 <<< 15896 1727203889.71898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203889.71923: stderr chunk (state=3): >>><<< 15896 1727203889.71927: stdout chunk (state=3): >>><<< 15896 1727203889.71939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203889.6903403-18438-274004965043103=/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203889.71983: variable 'ansible_module_compression' from source: unknown 15896 1727203889.72019: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15896 1727203889.72074: variable 'ansible_facts' from source: unknown 15896 1727203889.72195: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py 15896 1727203889.72293: Sending initial data 15896 1727203889.72296: Sent initial data (162 bytes) 15896 1727203889.72746: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.72749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.72752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203889.72755: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.72810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203889.72816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203889.72818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.72897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.74659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15896 1727203889.74663: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203889.74731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203889.74808: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp7mc5vzc7 /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py <<< 15896 1727203889.74812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py" <<< 15896 1727203889.74885: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp7mc5vzc7" to remote "/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py" <<< 15896 1727203889.74888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py" <<< 15896 1727203889.76382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203889.76386: stderr chunk (state=3): >>><<< 15896 1727203889.76389: stdout chunk (state=3): >>><<< 15896 1727203889.76391: done transferring module to remote 15896 1727203889.76393: _low_level_execute_command(): starting 15896 1727203889.76395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/ /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py && sleep 0' 15896 1727203889.76981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203889.76985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203889.77000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.77019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203889.77034: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203889.77047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.77085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203889.77128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203889.77140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.77223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203889.79327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203889.79351: stderr chunk (state=3): >>><<< 15896 1727203889.79358: stdout chunk (state=3): >>><<< 15896 1727203889.79481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203889.79490: _low_level_execute_command(): starting 15896 1727203889.79499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/AnsiballZ_package_facts.py && sleep 0' 15896 1727203889.80109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203889.80127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203889.80145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203889.80259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203889.80274: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203889.80290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203889.80306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203889.80426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203890.28060: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5",
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": 
"2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": 
"less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs":
[{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 15896 1727203890.28233: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 15896 1727203890.28238: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": 
"2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 15896 1727203890.28243: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 15896 1727203890.28246: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", 
"version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", 
"version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": 
"20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 15896 1727203890.28270: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15896 1727203890.30393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203890.30425: stderr chunk (state=3): >>><<< 15896 1727203890.30428: stdout chunk (state=3): >>><<< 15896 1727203890.30467: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 15896 1727203890.31655: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203890.31677: _low_level_execute_command(): starting 15896 1727203890.31682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203889.6903403-18438-274004965043103/ > /dev/null 2>&1 && sleep 0' 15896 1727203890.32141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203890.32144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203890.32147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203890.32149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203890.32151: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203890.32206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203890.32209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203890.32211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203890.32297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203890.34285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203890.34314: stderr chunk (state=3): >>><<< 15896 1727203890.34317: stdout chunk (state=3): >>><<< 15896 1727203890.34329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203890.34336: handler run complete 15896 1727203890.34887: variable 'ansible_facts' from source: unknown 15896 1727203890.35148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.36189: variable 'ansible_facts' from source: unknown 15896 1727203890.36434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.36811: attempt loop complete, returning result 15896 1727203890.36821: _execute() done 15896 1727203890.36824: dumping result to json 15896 1727203890.36938: done dumping result, returning 15896 1727203890.36945: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-fb83-b6ad-00000000079c] 15896 1727203890.36949: sending task result for task 028d2410-947f-fb83-b6ad-00000000079c ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203890.38348: done sending task result for task 028d2410-947f-fb83-b6ad-00000000079c 15896 1727203890.38351: WORKER PROCESS EXITING 15896 1727203890.38362: no more pending results, returning what we have 15896 1727203890.38364: results queue empty 15896 1727203890.38364: checking for any_errors_fatal 15896 1727203890.38369: done checking for any_errors_fatal 15896 1727203890.38369: checking for max_fail_percentage 15896 1727203890.38370: done checking for max_fail_percentage 15896 1727203890.38371: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.38371: done checking to see if all hosts have failed 15896 1727203890.38372: getting the remaining hosts for this loop 15896 1727203890.38372: done getting the remaining hosts for this loop 15896 
1727203890.38377: getting the next task for host managed-node1 15896 1727203890.38382: done getting next task for host managed-node1 15896 1727203890.38384: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203890.38386: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203890.38394: getting variables 15896 1727203890.38395: in VariableManager get_vars() 15896 1727203890.38426: Calling all_inventory to load vars for managed-node1 15896 1727203890.38427: Calling groups_inventory to load vars for managed-node1 15896 1727203890.38429: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.38435: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.38437: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.38439: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.39160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.40016: done with get_vars() 15896 1727203890.40033: done getting variables 15896 1727203890.40079: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.752) 0:00:35.990 ***** 15896 1727203890.40108: entering _queue_task() for managed-node1/debug 15896 1727203890.40355: worker is 1 (out of 1 available) 15896 1727203890.40368: exiting _queue_task() for managed-node1/debug 15896 1727203890.40383: done queuing things up, now waiting for results queue to drain 15896 1727203890.40385: waiting for pending results... 15896 1727203890.40570: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203890.40677: in run() - task 028d2410-947f-fb83-b6ad-0000000000d0 15896 1727203890.40688: variable 'ansible_search_path' from source: unknown 15896 1727203890.40692: variable 'ansible_search_path' from source: unknown 15896 1727203890.40723: calling self._execute() 15896 1727203890.40805: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.40809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.40818: variable 'omit' from source: magic vars 15896 1727203890.41111: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.41120: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.41126: variable 'omit' from source: magic vars 15896 1727203890.41170: variable 'omit' from source: magic vars 15896 1727203890.41246: variable 'network_provider' from source: set_fact 15896 1727203890.41266: variable 'omit' from source: magic vars 15896 1727203890.41300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203890.41326: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 
1727203890.41342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203890.41356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203890.41369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203890.41394: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203890.41398: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.41401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.41470: Set connection var ansible_shell_type to sh 15896 1727203890.41478: Set connection var ansible_connection to ssh 15896 1727203890.41486: Set connection var ansible_shell_executable to /bin/sh 15896 1727203890.41489: Set connection var ansible_pipelining to False 15896 1727203890.41494: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203890.41499: Set connection var ansible_timeout to 10 15896 1727203890.41517: variable 'ansible_shell_executable' from source: unknown 15896 1727203890.41520: variable 'ansible_connection' from source: unknown 15896 1727203890.41523: variable 'ansible_module_compression' from source: unknown 15896 1727203890.41525: variable 'ansible_shell_type' from source: unknown 15896 1727203890.41528: variable 'ansible_shell_executable' from source: unknown 15896 1727203890.41530: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.41532: variable 'ansible_pipelining' from source: unknown 15896 1727203890.41535: variable 'ansible_timeout' from source: unknown 15896 1727203890.41539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.41641: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203890.41650: variable 'omit' from source: magic vars 15896 1727203890.41655: starting attempt loop 15896 1727203890.41657: running the handler 15896 1727203890.41698: handler run complete 15896 1727203890.41710: attempt loop complete, returning result 15896 1727203890.41713: _execute() done 15896 1727203890.41715: dumping result to json 15896 1727203890.41717: done dumping result, returning 15896 1727203890.41725: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-fb83-b6ad-0000000000d0] 15896 1727203890.41729: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d0 15896 1727203890.41812: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d0 15896 1727203890.41815: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 15896 1727203890.41882: no more pending results, returning what we have 15896 1727203890.41885: results queue empty 15896 1727203890.41886: checking for any_errors_fatal 15896 1727203890.41896: done checking for any_errors_fatal 15896 1727203890.41897: checking for max_fail_percentage 15896 1727203890.41899: done checking for max_fail_percentage 15896 1727203890.41899: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.41900: done checking to see if all hosts have failed 15896 1727203890.41900: getting the remaining hosts for this loop 15896 1727203890.41902: done getting the remaining hosts for this loop 15896 1727203890.41905: getting the next task for host managed-node1 15896 1727203890.41911: done getting next task for host managed-node1 15896 1727203890.41915: ^ task is: TASK: 
fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203890.41917: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203890.41931: getting variables 15896 1727203890.41933: in VariableManager get_vars() 15896 1727203890.41987: Calling all_inventory to load vars for managed-node1 15896 1727203890.41989: Calling groups_inventory to load vars for managed-node1 15896 1727203890.41991: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.42000: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.42002: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.42004: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.42785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.43657: done with get_vars() 15896 1727203890.43678: done getting variables 15896 1727203890.43723: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.036) 0:00:36.026 ***** 15896 1727203890.43748: entering _queue_task() for managed-node1/fail 15896 1727203890.44005: worker is 1 (out of 1 available) 15896 1727203890.44018: exiting _queue_task() for managed-node1/fail 15896 1727203890.44031: done queuing things up, now waiting for results queue to drain 15896 1727203890.44032: waiting for pending results... 15896 1727203890.44221: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203890.44315: in run() - task 028d2410-947f-fb83-b6ad-0000000000d1 15896 1727203890.44325: variable 'ansible_search_path' from source: unknown 15896 1727203890.44329: variable 'ansible_search_path' from source: unknown 15896 1727203890.44357: calling self._execute() 15896 1727203890.44449: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.44453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.44463: variable 'omit' from source: magic vars 15896 1727203890.44752: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.44761: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.44851: variable 'network_state' from source: role '' defaults 15896 1727203890.44860: Evaluated conditional (network_state != {}): False 15896 1727203890.44866: when evaluation is False, skipping this task 15896 1727203890.44869: _execute() done 15896 1727203890.44872: dumping result to json 15896 1727203890.44875: done dumping result, returning 15896 1727203890.44883: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : 
Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-fb83-b6ad-0000000000d1] 15896 1727203890.44887: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d1 15896 1727203890.44974: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d1 15896 1727203890.44980: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203890.45055: no more pending results, returning what we have 15896 1727203890.45058: results queue empty 15896 1727203890.45059: checking for any_errors_fatal 15896 1727203890.45066: done checking for any_errors_fatal 15896 1727203890.45066: checking for max_fail_percentage 15896 1727203890.45068: done checking for max_fail_percentage 15896 1727203890.45069: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.45070: done checking to see if all hosts have failed 15896 1727203890.45071: getting the remaining hosts for this loop 15896 1727203890.45072: done getting the remaining hosts for this loop 15896 1727203890.45077: getting the next task for host managed-node1 15896 1727203890.45082: done getting next task for host managed-node1 15896 1727203890.45086: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203890.45088: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203890.45106: getting variables 15896 1727203890.45108: in VariableManager get_vars() 15896 1727203890.45149: Calling all_inventory to load vars for managed-node1 15896 1727203890.45151: Calling groups_inventory to load vars for managed-node1 15896 1727203890.45153: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.45161: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.45163: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.45166: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.46031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.46878: done with get_vars() 15896 1727203890.46895: done getting variables 15896 1727203890.46937: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.032) 0:00:36.058 ***** 15896 1727203890.46961: entering _queue_task() for managed-node1/fail 15896 1727203890.47213: worker is 1 (out of 1 available) 15896 1727203890.47227: exiting _queue_task() for managed-node1/fail 15896 1727203890.47239: done queuing things up, now waiting for results queue to drain 15896 1727203890.47240: waiting for pending results... 
15896 1727203890.47427: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203890.47524: in run() - task 028d2410-947f-fb83-b6ad-0000000000d2 15896 1727203890.47535: variable 'ansible_search_path' from source: unknown 15896 1727203890.47538: variable 'ansible_search_path' from source: unknown 15896 1727203890.47565: calling self._execute() 15896 1727203890.47653: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.47658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.47667: variable 'omit' from source: magic vars 15896 1727203890.47957: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.47967: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.48050: variable 'network_state' from source: role '' defaults 15896 1727203890.48059: Evaluated conditional (network_state != {}): False 15896 1727203890.48066: when evaluation is False, skipping this task 15896 1727203890.48069: _execute() done 15896 1727203890.48071: dumping result to json 15896 1727203890.48074: done dumping result, returning 15896 1727203890.48078: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-fb83-b6ad-0000000000d2] 15896 1727203890.48084: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d2 15896 1727203890.48169: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d2 15896 1727203890.48172: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203890.48218: no more pending results, returning what we have 15896 
1727203890.48222: results queue empty 15896 1727203890.48223: checking for any_errors_fatal 15896 1727203890.48231: done checking for any_errors_fatal 15896 1727203890.48231: checking for max_fail_percentage 15896 1727203890.48233: done checking for max_fail_percentage 15896 1727203890.48234: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.48235: done checking to see if all hosts have failed 15896 1727203890.48235: getting the remaining hosts for this loop 15896 1727203890.48237: done getting the remaining hosts for this loop 15896 1727203890.48240: getting the next task for host managed-node1 15896 1727203890.48246: done getting next task for host managed-node1 15896 1727203890.48249: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203890.48252: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203890.48279: getting variables 15896 1727203890.48281: in VariableManager get_vars() 15896 1727203890.48325: Calling all_inventory to load vars for managed-node1 15896 1727203890.48328: Calling groups_inventory to load vars for managed-node1 15896 1727203890.48329: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.48338: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.48340: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.48343: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.49103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.49980: done with get_vars() 15896 1727203890.49999: done getting variables 15896 1727203890.50042: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.031) 0:00:36.090 ***** 15896 1727203890.50070: entering _queue_task() for managed-node1/fail 15896 1727203890.50329: worker is 1 (out of 1 available) 15896 1727203890.50344: exiting _queue_task() for managed-node1/fail 15896 1727203890.50357: done queuing things up, now waiting for results queue to drain 15896 1727203890.50358: waiting for pending results... 
15896 1727203890.50542: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203890.50639: in run() - task 028d2410-947f-fb83-b6ad-0000000000d3 15896 1727203890.50650: variable 'ansible_search_path' from source: unknown 15896 1727203890.50654: variable 'ansible_search_path' from source: unknown 15896 1727203890.50684: calling self._execute() 15896 1727203890.50781: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.50785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.50787: variable 'omit' from source: magic vars 15896 1727203890.51054: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.51065: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.51181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203890.52892: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203890.52934: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203890.52961: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203890.52991: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203890.53011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203890.53070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.53093: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.53111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.53137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.53147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.53218: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.53230: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15896 1727203890.53310: variable 'ansible_distribution' from source: facts 15896 1727203890.53314: variable '__network_rh_distros' from source: role '' defaults 15896 1727203890.53322: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15896 1727203890.53492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.53509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.53529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 
1727203890.53554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.53567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.53601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.53617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.53634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.53663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.53672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.53703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.53720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15896 1727203890.53736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.53766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.53776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.53972: variable 'network_connections' from source: task vars 15896 1727203890.53983: variable 'controller_profile' from source: play vars 15896 1727203890.54027: variable 'controller_profile' from source: play vars 15896 1727203890.54035: variable 'controller_device' from source: play vars 15896 1727203890.54081: variable 'controller_device' from source: play vars 15896 1727203890.54090: variable 'port1_profile' from source: play vars 15896 1727203890.54130: variable 'port1_profile' from source: play vars 15896 1727203890.54136: variable 'dhcp_interface1' from source: play vars 15896 1727203890.54183: variable 'dhcp_interface1' from source: play vars 15896 1727203890.54188: variable 'controller_profile' from source: play vars 15896 1727203890.54228: variable 'controller_profile' from source: play vars 15896 1727203890.54234: variable 'port2_profile' from source: play vars 15896 1727203890.54279: variable 'port2_profile' from source: play vars 15896 1727203890.54286: variable 'dhcp_interface2' from source: play vars 15896 1727203890.54327: variable 'dhcp_interface2' from source: play vars 15896 1727203890.54333: variable 'controller_profile' from source: play vars 15896 1727203890.54380: variable 'controller_profile' from source: play vars 15896 1727203890.54386: 
variable 'network_state' from source: role '' defaults 15896 1727203890.54432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203890.54541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203890.54570: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203890.54593: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203890.54620: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203890.54646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203890.54663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203890.54684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.54702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203890.54732: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15896 1727203890.54735: when evaluation is False, skipping this task 15896 1727203890.54738: _execute() done 15896 1727203890.54740: dumping result to 
json 15896 1727203890.54742: done dumping result, returning 15896 1727203890.54750: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-fb83-b6ad-0000000000d3] 15896 1727203890.54755: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d3 15896 1727203890.54842: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d3 15896 1727203890.54844: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15896 1727203890.54891: no more pending results, returning what we have 15896 1727203890.54894: results queue empty 15896 1727203890.54895: checking for any_errors_fatal 15896 1727203890.54900: done checking for any_errors_fatal 15896 1727203890.54900: checking for max_fail_percentage 15896 1727203890.54902: done checking for max_fail_percentage 15896 1727203890.54903: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.54903: done checking to see if all hosts have failed 15896 1727203890.54904: getting the remaining hosts for this loop 15896 1727203890.54905: done getting the remaining hosts for this loop 15896 1727203890.54909: getting the next task for host managed-node1 15896 1727203890.54915: done getting next task for host managed-node1 15896 1727203890.54919: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203890.54921: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203890.54941: getting variables 15896 1727203890.54942: in VariableManager get_vars() 15896 1727203890.54996: Calling all_inventory to load vars for managed-node1 15896 1727203890.54999: Calling groups_inventory to load vars for managed-node1 15896 1727203890.55001: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.55011: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.55013: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.55015: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.55938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.56791: done with get_vars() 15896 1727203890.56809: done getting variables 15896 1727203890.56852: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.068) 0:00:36.158 ***** 
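The skip recorded above comes from the conditional `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` evaluating to False. The Jinja2 filter chain can be emulated in plain Python to see why: `selectattr("type", "defined")` keeps only items that have the attribute, and the second `selectattr` applies the `match` test. This is a sketch with hypothetical sample data, not the role's actual profiles:

```python
import re

# Hypothetical connection profiles shaped like the role's network_connections
# variable; the third entry has no "type" key on purpose.
network_connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
    {"name": "bond0.1"},  # dropped by selectattr("type", "defined")
]

def has_team(connections):
    # selectattr("type", "defined"): keep items where the attribute exists.
    defined = [c for c in connections if "type" in c]
    # selectattr("type", "match", "^team$"): keep items whose type matches
    # the anchored regex, mirroring Jinja2's "match" test.
    teams = [c for c in defined if re.match(r"^team$", c["type"])]
    # | list | length > 0
    return len(teams) > 0

print(has_team(network_connections))  # -> False, so the abort task is skipped
```

With no profile of type `team` (and none in `network_state`), the `when:` result is False and Ansible reports `skip_reason: Conditional result was False`, exactly as in the record above.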
15896 1727203890.56877: entering _queue_task() for managed-node1/dnf 15896 1727203890.57132: worker is 1 (out of 1 available) 15896 1727203890.57145: exiting _queue_task() for managed-node1/dnf 15896 1727203890.57157: done queuing things up, now waiting for results queue to drain 15896 1727203890.57159: waiting for pending results... 15896 1727203890.57343: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203890.57447: in run() - task 028d2410-947f-fb83-b6ad-0000000000d4 15896 1727203890.57458: variable 'ansible_search_path' from source: unknown 15896 1727203890.57461: variable 'ansible_search_path' from source: unknown 15896 1727203890.57496: calling self._execute() 15896 1727203890.57579: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.57582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.57591: variable 'omit' from source: magic vars 15896 1727203890.57879: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.57888: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.58023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203890.59544: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203890.59593: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203890.59620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203890.59645: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203890.59668: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203890.59725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.59755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.59779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.59805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.59816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.59903: variable 'ansible_distribution' from source: facts 15896 1727203890.59907: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.59919: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15896 1727203890.59999: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203890.60082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.60102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.60118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.60143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.60153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.60185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.60205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.60221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.60245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.60255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.60287: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.60303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.60321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.60345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.60355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.60455: variable 'network_connections' from source: task vars 15896 1727203890.60468: variable 'controller_profile' from source: play vars 15896 1727203890.60512: variable 'controller_profile' from source: play vars 15896 1727203890.60520: variable 'controller_device' from source: play vars 15896 1727203890.60563: variable 'controller_device' from source: play vars 15896 1727203890.60573: variable 'port1_profile' from source: play vars 15896 1727203890.60615: variable 'port1_profile' from source: play vars 15896 1727203890.60621: variable 'dhcp_interface1' from source: play vars 15896 1727203890.60667: variable 'dhcp_interface1' from source: play vars 15896 1727203890.60672: variable 'controller_profile' from source: play vars 15896 1727203890.60715: variable 'controller_profile' from source: play vars 15896 1727203890.60721: variable 'port2_profile' from source: play vars 15896 
1727203890.60765: variable 'port2_profile' from source: play vars 15896 1727203890.60772: variable 'dhcp_interface2' from source: play vars 15896 1727203890.60814: variable 'dhcp_interface2' from source: play vars 15896 1727203890.60820: variable 'controller_profile' from source: play vars 15896 1727203890.60861: variable 'controller_profile' from source: play vars 15896 1727203890.60912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203890.61021: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203890.61047: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203890.61073: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203890.61102: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203890.61132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203890.61157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203890.61191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.61202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203890.61246: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203890.61392: variable 
'network_connections' from source: task vars 15896 1727203890.61395: variable 'controller_profile' from source: play vars 15896 1727203890.61438: variable 'controller_profile' from source: play vars 15896 1727203890.61444: variable 'controller_device' from source: play vars 15896 1727203890.61487: variable 'controller_device' from source: play vars 15896 1727203890.61494: variable 'port1_profile' from source: play vars 15896 1727203890.61538: variable 'port1_profile' from source: play vars 15896 1727203890.61544: variable 'dhcp_interface1' from source: play vars 15896 1727203890.61586: variable 'dhcp_interface1' from source: play vars 15896 1727203890.61592: variable 'controller_profile' from source: play vars 15896 1727203890.61632: variable 'controller_profile' from source: play vars 15896 1727203890.61643: variable 'port2_profile' from source: play vars 15896 1727203890.61687: variable 'port2_profile' from source: play vars 15896 1727203890.61692: variable 'dhcp_interface2' from source: play vars 15896 1727203890.61735: variable 'dhcp_interface2' from source: play vars 15896 1727203890.61741: variable 'controller_profile' from source: play vars 15896 1727203890.61786: variable 'controller_profile' from source: play vars 15896 1727203890.61809: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203890.61812: when evaluation is False, skipping this task 15896 1727203890.61815: _execute() done 15896 1727203890.61818: dumping result to json 15896 1727203890.61820: done dumping result, returning 15896 1727203890.61828: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-0000000000d4] 15896 1727203890.61833: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d4 15896 1727203890.61922: done sending task result for 
task 028d2410-947f-fb83-b6ad-0000000000d4 15896 1727203890.61924: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203890.62003: no more pending results, returning what we have 15896 1727203890.62007: results queue empty 15896 1727203890.62007: checking for any_errors_fatal 15896 1727203890.62013: done checking for any_errors_fatal 15896 1727203890.62014: checking for max_fail_percentage 15896 1727203890.62016: done checking for max_fail_percentage 15896 1727203890.62017: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.62018: done checking to see if all hosts have failed 15896 1727203890.62018: getting the remaining hosts for this loop 15896 1727203890.62020: done getting the remaining hosts for this loop 15896 1727203890.62023: getting the next task for host managed-node1 15896 1727203890.62030: done getting next task for host managed-node1 15896 1727203890.62033: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203890.62036: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203890.62055: getting variables 15896 1727203890.62056: in VariableManager get_vars() 15896 1727203890.62106: Calling all_inventory to load vars for managed-node1 15896 1727203890.62108: Calling groups_inventory to load vars for managed-node1 15896 1727203890.62110: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.62120: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.62122: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.62124: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.62905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.63855: done with get_vars() 15896 1727203890.63871: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15896 1727203890.63926: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.070) 0:00:36.228 ***** 15896 1727203890.63947: entering _queue_task() for managed-node1/yum 15896 1727203890.64186: worker is 1 (out of 1 available) 15896 1727203890.64199: exiting _queue_task() for managed-node1/yum 15896 1727203890.64210: done queuing things up, now waiting for results queue to drain 15896 1727203890.64212: waiting for pending results... 
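The YUM variant of the package check is gated on `ansible_distribution_major_version | int < 8`, which the log evaluates to False before redirecting `ansible.builtin.yum` to `ansible.builtin.dnf`. Ansible stores distribution facts as strings, so the conditional must cast with `| int` before comparing. A minimal sketch (the version value here is hypothetical; the log only shows that it is not '6' and not below 8):

```python
# Hypothetical fact value; Ansible facts are strings, hence the int() cast
# that mirrors the Jinja2 "| int" filter in the task's `when:` clause.
ansible_distribution_major_version = "9"

use_yum = int(ansible_distribution_major_version) < 8   # the YUM task's gate
use_dnf = int(ansible_distribution_major_version) >= 8  # the DNF path instead

print(use_yum, use_dnf)  # -> False True: the YUM check is skipped
```

Comparing the raw strings instead (`"9" < "8"` is False, but `"10" < "8"` is True lexicographically) is the classic pitfall the `| int` filter avoids.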
15896 1727203890.64394: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203890.64495: in run() - task 028d2410-947f-fb83-b6ad-0000000000d5 15896 1727203890.64507: variable 'ansible_search_path' from source: unknown 15896 1727203890.64510: variable 'ansible_search_path' from source: unknown 15896 1727203890.64538: calling self._execute() 15896 1727203890.64625: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.64628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.64637: variable 'omit' from source: magic vars 15896 1727203890.64917: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.64926: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.65044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203890.66539: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203890.66586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203890.66615: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203890.66640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203890.66659: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203890.66720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.66750: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.66771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.66798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.66808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.66879: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.66891: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15896 1727203890.66893: when evaluation is False, skipping this task 15896 1727203890.66896: _execute() done 15896 1727203890.66900: dumping result to json 15896 1727203890.66903: done dumping result, returning 15896 1727203890.66911: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-0000000000d5] 15896 1727203890.66915: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d5 15896 1727203890.67005: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d5 15896 1727203890.67008: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15896 1727203890.67086: no more pending results, returning 
what we have 15896 1727203890.67090: results queue empty 15896 1727203890.67090: checking for any_errors_fatal 15896 1727203890.67096: done checking for any_errors_fatal 15896 1727203890.67097: checking for max_fail_percentage 15896 1727203890.67099: done checking for max_fail_percentage 15896 1727203890.67100: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.67101: done checking to see if all hosts have failed 15896 1727203890.67101: getting the remaining hosts for this loop 15896 1727203890.67103: done getting the remaining hosts for this loop 15896 1727203890.67106: getting the next task for host managed-node1 15896 1727203890.67111: done getting next task for host managed-node1 15896 1727203890.67115: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203890.67118: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203890.67139: getting variables 15896 1727203890.67141: in VariableManager get_vars() 15896 1727203890.67189: Calling all_inventory to load vars for managed-node1 15896 1727203890.67192: Calling groups_inventory to load vars for managed-node1 15896 1727203890.67194: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.67203: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.67206: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.67208: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.67989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.68855: done with get_vars() 15896 1727203890.68878: done getting variables 15896 1727203890.68923: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.049) 0:00:36.278 ***** 15896 1727203890.68948: entering _queue_task() for managed-node1/fail 15896 1727203890.69218: worker is 1 (out of 1 available) 15896 1727203890.69231: exiting _queue_task() for managed-node1/fail 15896 1727203890.69245: done queuing things up, now waiting for results queue to drain 15896 1727203890.69246: waiting for pending results... 
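The consent task above is skipped because `__network_wireless_connections_defined or __network_team_connections_defined` is False: the play's profiles (a bond controller with two ethernet ports, per the `controller_profile`/`port1_profile`/`port2_profile` vars) contain neither wireless nor team interfaces. A sketch combining both checks; the profile data and the `^wireless$` pattern are assumptions for illustration, not the role's exact defaults:

```python
import re

# Hypothetical profiles mirroring the play vars seen in the log: one bond
# controller and two ethernet ports, none wireless and none team.
network_connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]

def any_of_type(connections, pattern):
    # Emulates: selectattr("type", "defined") | selectattr("type", "match", pattern)
    #           | list | length > 0
    return any("type" in c and re.match(pattern, c["type"]) for c in connections)

wireless_defined = any_of_type(network_connections, r"^wireless$")
team_defined = any_of_type(network_connections, r"^team$")

print(wireless_defined or team_defined)  # -> False: no restart consent needed
```

Only when either flag is True would the role run the `fail` action to stop and ask for consent to restart NetworkManager; here both are False, so execution moves on to the next task.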
15896 1727203890.69448: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203890.69554: in run() - task 028d2410-947f-fb83-b6ad-0000000000d6 15896 1727203890.69567: variable 'ansible_search_path' from source: unknown 15896 1727203890.69571: variable 'ansible_search_path' from source: unknown 15896 1727203890.69608: calling self._execute() 15896 1727203890.69694: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.69697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.69705: variable 'omit' from source: magic vars 15896 1727203890.69994: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.70003: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.70085: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203890.70212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203890.71710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203890.71757: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203890.71786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203890.71812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203890.71831: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203890.71896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203890.72190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.72209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.72236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.72247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.72286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.72305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.72321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.72346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.72356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.72388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.72406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.72423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.72446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.72457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.72573: variable 'network_connections' from source: task vars 15896 1727203890.72586: variable 'controller_profile' from source: play vars 15896 1727203890.72638: variable 'controller_profile' from source: play vars 15896 1727203890.72647: variable 'controller_device' from source: play vars 15896 1727203890.72693: variable 'controller_device' from source: play vars 15896 1727203890.72701: variable 'port1_profile' from source: play vars 15896 1727203890.72744: variable 'port1_profile' from source: play vars 15896 1727203890.72750: variable 'dhcp_interface1' from source: play vars 15896 1727203890.72793: variable 'dhcp_interface1' from source: play vars 15896 1727203890.72799: variable 'controller_profile' from source: play vars 15896 
1727203890.72840: variable 'controller_profile' from source: play vars 15896 1727203890.72846: variable 'port2_profile' from source: play vars 15896 1727203890.72890: variable 'port2_profile' from source: play vars 15896 1727203890.72896: variable 'dhcp_interface2' from source: play vars 15896 1727203890.72936: variable 'dhcp_interface2' from source: play vars 15896 1727203890.72941: variable 'controller_profile' from source: play vars 15896 1727203890.72986: variable 'controller_profile' from source: play vars 15896 1727203890.73036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203890.73158: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203890.73192: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203890.73215: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203890.73237: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203890.73271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203890.73290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203890.73307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.73324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 15896 1727203890.73378: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203890.73530: variable 'network_connections' from source: task vars 15896 1727203890.73534: variable 'controller_profile' from source: play vars 15896 1727203890.73579: variable 'controller_profile' from source: play vars 15896 1727203890.73585: variable 'controller_device' from source: play vars 15896 1727203890.73628: variable 'controller_device' from source: play vars 15896 1727203890.73636: variable 'port1_profile' from source: play vars 15896 1727203890.73678: variable 'port1_profile' from source: play vars 15896 1727203890.73684: variable 'dhcp_interface1' from source: play vars 15896 1727203890.73739: variable 'dhcp_interface1' from source: play vars 15896 1727203890.73744: variable 'controller_profile' from source: play vars 15896 1727203890.73788: variable 'controller_profile' from source: play vars 15896 1727203890.73794: variable 'port2_profile' from source: play vars 15896 1727203890.73837: variable 'port2_profile' from source: play vars 15896 1727203890.73843: variable 'dhcp_interface2' from source: play vars 15896 1727203890.73887: variable 'dhcp_interface2' from source: play vars 15896 1727203890.73892: variable 'controller_profile' from source: play vars 15896 1727203890.73935: variable 'controller_profile' from source: play vars 15896 1727203890.73964: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203890.73967: when evaluation is False, skipping this task 15896 1727203890.73970: _execute() done 15896 1727203890.73972: dumping result to json 15896 1727203890.73974: done dumping result, returning 15896 1727203890.73983: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-0000000000d6] 15896 1727203890.73988: sending 
task result for task 028d2410-947f-fb83-b6ad-0000000000d6 15896 1727203890.74082: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d6 15896 1727203890.74085: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203890.74132: no more pending results, returning what we have 15896 1727203890.74135: results queue empty 15896 1727203890.74136: checking for any_errors_fatal 15896 1727203890.74142: done checking for any_errors_fatal 15896 1727203890.74142: checking for max_fail_percentage 15896 1727203890.74144: done checking for max_fail_percentage 15896 1727203890.74145: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.74146: done checking to see if all hosts have failed 15896 1727203890.74146: getting the remaining hosts for this loop 15896 1727203890.74148: done getting the remaining hosts for this loop 15896 1727203890.74151: getting the next task for host managed-node1 15896 1727203890.74157: done getting next task for host managed-node1 15896 1727203890.74163: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15896 1727203890.74166: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203890.74188: getting variables 15896 1727203890.74190: in VariableManager get_vars() 15896 1727203890.74243: Calling all_inventory to load vars for managed-node1 15896 1727203890.74246: Calling groups_inventory to load vars for managed-node1 15896 1727203890.74248: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.74259: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.74264: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.74267: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.75230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.76098: done with get_vars() 15896 1727203890.76124: done getting variables 15896 1727203890.76171: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.072) 0:00:36.351 ***** 15896 1727203890.76199: entering _queue_task() for managed-node1/package 15896 1727203890.76478: worker is 1 (out of 1 available) 15896 1727203890.76490: exiting _queue_task() for managed-node1/package 15896 1727203890.76502: done queuing things up, now waiting for results queue to drain 15896 1727203890.76504: waiting for pending results... 
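(Editor's note, not part of the log.) The "Install packages" task that follows is skipped because its `when` condition, `not network_packages is subset(ansible_facts.packages.keys())`, evaluates to False: every required package already appears in the gathered package facts. The sketch below is an illustrative reimplementation of that check in plain Python, not Ansible's actual code; the sample data (`network_packages`, `installed`) is hypothetical and only mirrors the shapes seen in this log.

```python
def is_subset(needed, available):
    """Return True if every item in `needed` also appears in `available`,
    mirroring the semantics of Ansible's `subset` Jinja2 test
    (shipped in plugins/test/mathstuff.py)."""
    return set(needed) <= set(available)

# Hypothetical stand-ins for `network_packages` (role default) and
# `ansible_facts.packages` (from the package_facts module):
network_packages = ["NetworkManager"]
installed = {"NetworkManager": [{"version": "1.48"}],
             "openssh": [{"version": "9.6"}]}

# The task runs only when the packages are NOT already installed,
# which matches the "Conditional result was False" skip in the log:
should_run = not is_subset(network_packages, installed.keys())
```

Under this data `should_run` is False, so the task would be skipped, just as the log reports for managed-node1.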
15896 1727203890.76693: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 15896 1727203890.76793: in run() - task 028d2410-947f-fb83-b6ad-0000000000d7 15896 1727203890.76805: variable 'ansible_search_path' from source: unknown 15896 1727203890.76808: variable 'ansible_search_path' from source: unknown 15896 1727203890.76842: calling self._execute() 15896 1727203890.76921: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.76924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.76933: variable 'omit' from source: magic vars 15896 1727203890.77220: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.77229: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.77364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203890.77557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203890.77595: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203890.77622: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203890.77676: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203890.77759: variable 'network_packages' from source: role '' defaults 15896 1727203890.77847: variable '__network_provider_setup' from source: role '' defaults 15896 1727203890.77857: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203890.77911: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203890.77915: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203890.77963: variable 
'__network_packages_default_nm' from source: role '' defaults 15896 1727203890.78081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203890.79421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203890.79467: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203890.79497: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203890.79520: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203890.79540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203890.79601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.79622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.79639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.79668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.79680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 
1727203890.79712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.79728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.79744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.79771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.79784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.79929: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203890.80006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.80022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.80038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.80065: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.80073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.80138: variable 'ansible_python' from source: facts 15896 1727203890.80158: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203890.80217: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203890.80272: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203890.80367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.80386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.80402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.80428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.80439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.80470: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203890.80491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203890.80507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.80536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203890.80543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203890.80638: variable 'network_connections' from source: task vars 15896 1727203890.80642: variable 'controller_profile' from source: play vars 15896 1727203890.80713: variable 'controller_profile' from source: play vars 15896 1727203890.80722: variable 'controller_device' from source: play vars 15896 1727203890.80791: variable 'controller_device' from source: play vars 15896 1727203890.80801: variable 'port1_profile' from source: play vars 15896 1727203890.80871: variable 'port1_profile' from source: play vars 15896 1727203890.80875: variable 'dhcp_interface1' from source: play vars 15896 1727203890.80940: variable 'dhcp_interface1' from source: play vars 15896 1727203890.80947: variable 'controller_profile' from source: play vars 15896 1727203890.81016: variable 'controller_profile' from source: play vars 15896 1727203890.81024: variable 'port2_profile' from source: play vars 15896 
1727203890.81094: variable 'port2_profile' from source: play vars 15896 1727203890.81102: variable 'dhcp_interface2' from source: play vars 15896 1727203890.81164: variable 'dhcp_interface2' from source: play vars 15896 1727203890.81179: variable 'controller_profile' from source: play vars 15896 1727203890.81245: variable 'controller_profile' from source: play vars 15896 1727203890.81302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203890.81321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203890.81342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203890.81365: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203890.81403: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203890.81580: variable 'network_connections' from source: task vars 15896 1727203890.81583: variable 'controller_profile' from source: play vars 15896 1727203890.81652: variable 'controller_profile' from source: play vars 15896 1727203890.81659: variable 'controller_device' from source: play vars 15896 1727203890.81726: variable 'controller_device' from source: play vars 15896 1727203890.81735: variable 'port1_profile' from source: play vars 15896 1727203890.81809: variable 'port1_profile' from source: play vars 15896 1727203890.81816: variable 'dhcp_interface1' from source: play vars 15896 1727203890.81898: variable 'dhcp_interface1' from source: 
play vars 15896 1727203890.81901: variable 'controller_profile' from source: play vars 15896 1727203890.81970: variable 'controller_profile' from source: play vars 15896 1727203890.81978: variable 'port2_profile' from source: play vars 15896 1727203890.82046: variable 'port2_profile' from source: play vars 15896 1727203890.82053: variable 'dhcp_interface2' from source: play vars 15896 1727203890.82124: variable 'dhcp_interface2' from source: play vars 15896 1727203890.82131: variable 'controller_profile' from source: play vars 15896 1727203890.82201: variable 'controller_profile' from source: play vars 15896 1727203890.82239: variable '__network_packages_default_wireless' from source: role '' defaults 15896 1727203890.82299: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203890.82490: variable 'network_connections' from source: task vars 15896 1727203890.82494: variable 'controller_profile' from source: play vars 15896 1727203890.82539: variable 'controller_profile' from source: play vars 15896 1727203890.82545: variable 'controller_device' from source: play vars 15896 1727203890.82593: variable 'controller_device' from source: play vars 15896 1727203890.82600: variable 'port1_profile' from source: play vars 15896 1727203890.82645: variable 'port1_profile' from source: play vars 15896 1727203890.82651: variable 'dhcp_interface1' from source: play vars 15896 1727203890.82699: variable 'dhcp_interface1' from source: play vars 15896 1727203890.82705: variable 'controller_profile' from source: play vars 15896 1727203890.82750: variable 'controller_profile' from source: play vars 15896 1727203890.82756: variable 'port2_profile' from source: play vars 15896 1727203890.82803: variable 'port2_profile' from source: play vars 15896 1727203890.82809: variable 'dhcp_interface2' from source: play vars 15896 1727203890.82855: variable 'dhcp_interface2' from source: play vars 15896 1727203890.82860: variable 'controller_profile' from 
source: play vars 15896 1727203890.82908: variable 'controller_profile' from source: play vars 15896 1727203890.82926: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203890.82984: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203890.83178: variable 'network_connections' from source: task vars 15896 1727203890.83182: variable 'controller_profile' from source: play vars 15896 1727203890.83225: variable 'controller_profile' from source: play vars 15896 1727203890.83231: variable 'controller_device' from source: play vars 15896 1727203890.83279: variable 'controller_device' from source: play vars 15896 1727203890.83287: variable 'port1_profile' from source: play vars 15896 1727203890.83330: variable 'port1_profile' from source: play vars 15896 1727203890.83336: variable 'dhcp_interface1' from source: play vars 15896 1727203890.83384: variable 'dhcp_interface1' from source: play vars 15896 1727203890.83390: variable 'controller_profile' from source: play vars 15896 1727203890.83434: variable 'controller_profile' from source: play vars 15896 1727203890.83439: variable 'port2_profile' from source: play vars 15896 1727203890.83488: variable 'port2_profile' from source: play vars 15896 1727203890.83494: variable 'dhcp_interface2' from source: play vars 15896 1727203890.83537: variable 'dhcp_interface2' from source: play vars 15896 1727203890.83543: variable 'controller_profile' from source: play vars 15896 1727203890.83593: variable 'controller_profile' from source: play vars 15896 1727203890.83636: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203890.83680: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203890.83686: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203890.83728: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 
1727203890.83864: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203890.84178: variable 'network_connections' from source: task vars 15896 1727203890.84182: variable 'controller_profile' from source: play vars 15896 1727203890.84223: variable 'controller_profile' from source: play vars 15896 1727203890.84231: variable 'controller_device' from source: play vars 15896 1727203890.84274: variable 'controller_device' from source: play vars 15896 1727203890.84283: variable 'port1_profile' from source: play vars 15896 1727203890.84323: variable 'port1_profile' from source: play vars 15896 1727203890.84329: variable 'dhcp_interface1' from source: play vars 15896 1727203890.84373: variable 'dhcp_interface1' from source: play vars 15896 1727203890.84380: variable 'controller_profile' from source: play vars 15896 1727203890.84420: variable 'controller_profile' from source: play vars 15896 1727203890.84426: variable 'port2_profile' from source: play vars 15896 1727203890.84470: variable 'port2_profile' from source: play vars 15896 1727203890.84477: variable 'dhcp_interface2' from source: play vars 15896 1727203890.84517: variable 'dhcp_interface2' from source: play vars 15896 1727203890.84523: variable 'controller_profile' from source: play vars 15896 1727203890.84563: variable 'controller_profile' from source: play vars 15896 1727203890.84572: variable 'ansible_distribution' from source: facts 15896 1727203890.84581: variable '__network_rh_distros' from source: role '' defaults 15896 1727203890.84587: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.84606: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203890.84717: variable 'ansible_distribution' from source: facts 15896 1727203890.84721: variable '__network_rh_distros' from source: role '' defaults 15896 1727203890.84726: variable 'ansible_distribution_major_version' from source: 
facts 15896 1727203890.84739: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203890.84848: variable 'ansible_distribution' from source: facts 15896 1727203890.84851: variable '__network_rh_distros' from source: role '' defaults 15896 1727203890.84856: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.84885: variable 'network_provider' from source: set_fact 15896 1727203890.84899: variable 'ansible_facts' from source: unknown 15896 1727203890.85324: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15896 1727203890.85329: when evaluation is False, skipping this task 15896 1727203890.85331: _execute() done 15896 1727203890.85334: dumping result to json 15896 1727203890.85336: done dumping result, returning 15896 1727203890.85343: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-fb83-b6ad-0000000000d7] 15896 1727203890.85348: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d7 15896 1727203890.85436: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d7 15896 1727203890.85439: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15896 1727203890.85495: no more pending results, returning what we have 15896 1727203890.85498: results queue empty 15896 1727203890.85499: checking for any_errors_fatal 15896 1727203890.85506: done checking for any_errors_fatal 15896 1727203890.85507: checking for max_fail_percentage 15896 1727203890.85508: done checking for max_fail_percentage 15896 1727203890.85509: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.85510: done checking to see if all hosts have failed 15896 1727203890.85511: getting the remaining hosts for 
this loop 15896 1727203890.85512: done getting the remaining hosts for this loop 15896 1727203890.85515: getting the next task for host managed-node1 15896 1727203890.85522: done getting next task for host managed-node1 15896 1727203890.85525: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203890.85528: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203890.85547: getting variables 15896 1727203890.85549: in VariableManager get_vars() 15896 1727203890.85604: Calling all_inventory to load vars for managed-node1 15896 1727203890.85607: Calling groups_inventory to load vars for managed-node1 15896 1727203890.85609: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203890.85619: Calling all_plugins_play to load vars for managed-node1 15896 1727203890.85621: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203890.85624: Calling groups_plugins_play to load vars for managed-node1 15896 1727203890.86448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203890.87307: done with get_vars() 15896 1727203890.87325: done getting variables 15896 1727203890.87368: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.111) 0:00:36.463 ***** 15896 1727203890.87395: entering _queue_task() for managed-node1/package 15896 1727203890.87645: worker is 1 (out of 1 available) 15896 1727203890.87657: exiting _queue_task() for managed-node1/package 15896 1727203890.87671: done queuing things up, now waiting for results queue to drain 15896 1727203890.87672: waiting for pending results... 15896 1727203890.87859: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203890.87953: in run() - task 028d2410-947f-fb83-b6ad-0000000000d8 15896 1727203890.87966: variable 'ansible_search_path' from source: unknown 15896 1727203890.87970: variable 'ansible_search_path' from source: unknown 15896 1727203890.88001: calling self._execute() 15896 1727203890.88087: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203890.88090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203890.88100: variable 'omit' from source: magic vars 15896 1727203890.88381: variable 'ansible_distribution_major_version' from source: facts 15896 1727203890.88391: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203890.88474: variable 'network_state' from source: role '' defaults 15896 1727203890.88484: Evaluated conditional (network_state != {}): False 15896 1727203890.88487: when evaluation is False, skipping this task 15896 1727203890.88490: _execute() done 15896 
1727203890.88492: dumping result to json 15896 1727203890.88495: done dumping result, returning 15896 1727203890.88503: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-fb83-b6ad-0000000000d8] 15896 1727203890.88509: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d8 15896 1727203890.88601: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d8 15896 1727203890.88604: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203890.88648: no more pending results, returning what we have 15896 1727203890.88653: results queue empty 15896 1727203890.88654: checking for any_errors_fatal 15896 1727203890.88661: done checking for any_errors_fatal 15896 1727203890.88661: checking for max_fail_percentage 15896 1727203890.88663: done checking for max_fail_percentage 15896 1727203890.88664: checking to see if all hosts have failed and the running result is not ok 15896 1727203890.88665: done checking to see if all hosts have failed 15896 1727203890.88665: getting the remaining hosts for this loop 15896 1727203890.88667: done getting the remaining hosts for this loop 15896 1727203890.88670: getting the next task for host managed-node1 15896 1727203890.88678: done getting next task for host managed-node1 15896 1727203890.88681: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15896 1727203890.88684: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203890.88706: getting variables
15896 1727203890.88708: in VariableManager get_vars()
15896 1727203890.88755: Calling all_inventory to load vars for managed-node1
15896 1727203890.88758: Calling groups_inventory to load vars for managed-node1
15896 1727203890.88760: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203890.88769: Calling all_plugins_play to load vars for managed-node1
15896 1727203890.88772: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203890.88774: Calling groups_plugins_play to load vars for managed-node1
15896 1727203890.89674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203890.90552: done with get_vars()
15896 1727203890.90577: done getting variables
15896 1727203890.90626: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.032) 0:00:36.495 *****
15896 1727203890.90650: entering _queue_task() for managed-node1/package
15896 1727203890.90924: worker is 1 (out of 1 available)
15896 1727203890.90935: exiting _queue_task() for managed-node1/package
15896 1727203890.90948: done queuing things up, now waiting for results queue to drain
15896 1727203890.90950: waiting for pending results...
15896 1727203890.91222: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15896 1727203890.91319: in run() - task 028d2410-947f-fb83-b6ad-0000000000d9
15896 1727203890.91328: variable 'ansible_search_path' from source: unknown
15896 1727203890.91331: variable 'ansible_search_path' from source: unknown
15896 1727203890.91361: calling self._execute()
15896 1727203890.91447: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203890.91458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203890.91469: variable 'omit' from source: magic vars
15896 1727203890.91753: variable 'ansible_distribution_major_version' from source: facts
15896 1727203890.91763: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203890.91847: variable 'network_state' from source: role '' defaults
15896 1727203890.91857: Evaluated conditional (network_state != {}): False
15896 1727203890.91860: when evaluation is False, skipping this task
15896 1727203890.91868: _execute() done
15896 1727203890.91871: dumping result to json
15896 1727203890.91878: done dumping result, returning
15896 1727203890.91885: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-fb83-b6ad-0000000000d9]
15896 1727203890.91887: sending task result for task 028d2410-947f-fb83-b6ad-0000000000d9
15896 1727203890.91980: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000d9
15896 1727203890.91982: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203890.92033: no more pending results, returning what we have
15896 1727203890.92037: results queue empty
15896 1727203890.92038: checking for any_errors_fatal
15896 1727203890.92044: done checking for any_errors_fatal
15896 1727203890.92045: checking for max_fail_percentage
15896 1727203890.92047: done checking for max_fail_percentage
15896 1727203890.92047: checking to see if all hosts have failed and the running result is not ok
15896 1727203890.92048: done checking to see if all hosts have failed
15896 1727203890.92049: getting the remaining hosts for this loop
15896 1727203890.92050: done getting the remaining hosts for this loop
15896 1727203890.92054: getting the next task for host managed-node1
15896 1727203890.92061: done getting next task for host managed-node1
15896 1727203890.92064: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15896 1727203890.92068: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203890.92093: getting variables
15896 1727203890.92095: in VariableManager get_vars()
15896 1727203890.92145: Calling all_inventory to load vars for managed-node1
15896 1727203890.92148: Calling groups_inventory to load vars for managed-node1
15896 1727203890.92150: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203890.92160: Calling all_plugins_play to load vars for managed-node1
15896 1727203890.92162: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203890.92165: Calling groups_plugins_play to load vars for managed-node1
15896 1727203890.93223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203890.94797: done with get_vars()
15896 1727203890.94816: done getting variables
15896 1727203890.94861: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:51:30 -0400 (0:00:00.042) 0:00:36.538 *****
15896 1727203890.94889: entering _queue_task() for managed-node1/service
15896 1727203890.95151: worker is 1 (out of 1 available)
15896 1727203890.95163: exiting _queue_task() for managed-node1/service
15896 1727203890.95177: done queuing things up, now waiting for results queue to drain
15896 1727203890.95179: waiting for pending results...
15896 1727203890.95498: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15896 1727203890.95545: in run() - task 028d2410-947f-fb83-b6ad-0000000000da
15896 1727203890.95570: variable 'ansible_search_path' from source: unknown
15896 1727203890.95581: variable 'ansible_search_path' from source: unknown
15896 1727203890.95623: calling self._execute()
15896 1727203890.95738: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203890.95750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203890.95770: variable 'omit' from source: magic vars
15896 1727203890.96182: variable 'ansible_distribution_major_version' from source: facts
15896 1727203890.96201: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203890.96334: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203890.96915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203891.01528: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203891.01607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203891.01650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203891.01705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203891.01781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203891.01858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.01915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.01948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.01999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.02019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.02070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.02180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.02184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.02186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.02192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.02237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.02265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.02300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.02382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.02385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.02546: variable 'network_connections' from source: task vars
15896 1727203891.02563: variable 'controller_profile' from source: play vars
15896 1727203891.02654: variable 'controller_profile' from source: play vars
15896 1727203891.02683: variable 'controller_device' from source: play vars
15896 1727203891.02758: variable 'controller_device' from source: play vars
15896 1727203891.02846: variable 'port1_profile' from source: play vars
15896 1727203891.02850: variable 'port1_profile' from source: play vars
15896 1727203891.02857: variable 'dhcp_interface1' from source: play vars
15896 1727203891.02918: variable 'dhcp_interface1' from source: play vars
15896 1727203891.02931: variable 'controller_profile' from source: play vars
15896 1727203891.03001: variable 'controller_profile' from source: play vars
15896 1727203891.03014: variable 'port2_profile' from source: play vars
15896 1727203891.03083: variable 'port2_profile' from source: play vars
15896 1727203891.03095: variable 'dhcp_interface2' from source: play vars
15896 1727203891.03162: variable 'dhcp_interface2' from source: play vars
15896 1727203891.03180: variable 'controller_profile' from source: play vars
15896 1727203891.03241: variable 'controller_profile' from source: play vars
15896 1727203891.03318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203891.03581: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203891.03584: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203891.03586: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203891.03613: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203891.03663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203891.03710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203891.03738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.03809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203891.03985: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203891.04619: variable 'network_connections' from source: task vars
15896 1727203891.04679: variable 'controller_profile' from source: play vars
15896 1727203891.04751: variable 'controller_profile' from source: play vars
15896 1727203891.04764: variable 'controller_device' from source: play vars
15896 1727203891.04830: variable 'controller_device' from source: play vars
15896 1727203891.04848: variable 'port1_profile' from source: play vars
15896 1727203891.04911: variable 'port1_profile' from source: play vars
15896 1727203891.04924: variable 'dhcp_interface1' from source: play vars
15896 1727203891.04994: variable 'dhcp_interface1' from source: play vars
15896 1727203891.05028: variable 'controller_profile' from source: play vars
15896 1727203891.05500: variable 'controller_profile' from source: play vars
15896 1727203891.05504: variable 'port2_profile' from source: play vars
15896 1727203891.05506: variable 'port2_profile' from source: play vars
15896 1727203891.05508: variable 'dhcp_interface2' from source: play vars
15896 1727203891.05510: variable 'dhcp_interface2' from source: play vars
15896 1727203891.05512: variable 'controller_profile' from source: play vars
15896 1727203891.05620: variable 'controller_profile' from source: play vars
15896 1727203891.05661: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15896 1727203891.05721: when evaluation is False, skipping this task
15896 1727203891.05729: _execute() done
15896 1727203891.05759: dumping result to json
15896 1727203891.05831: done dumping result, returning
15896 1727203891.05845: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-0000000000da]
15896 1727203891.05857: sending task result for task 028d2410-947f-fb83-b6ad-0000000000da
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15896 1727203891.06008: no more pending results, returning what we have
15896 1727203891.06012: results queue empty
15896 1727203891.06013: checking for any_errors_fatal
15896 1727203891.06023: done checking for any_errors_fatal
15896 1727203891.06023: checking for max_fail_percentage
15896 1727203891.06025: done checking for max_fail_percentage
15896 1727203891.06026: checking to see if all hosts have failed and the running result is not ok
15896 1727203891.06027: done checking to see if all hosts have failed
15896 1727203891.06028: getting the remaining hosts for this loop
15896 1727203891.06029: done getting the remaining hosts for this loop
15896 1727203891.06033: getting the next task for host managed-node1
15896 1727203891.06040: done getting next task for host managed-node1
15896 1727203891.06043: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15896 1727203891.06046: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203891.06067: getting variables
15896 1727203891.06069: in VariableManager get_vars()
15896 1727203891.06127: Calling all_inventory to load vars for managed-node1
15896 1727203891.06131: Calling groups_inventory to load vars for managed-node1
15896 1727203891.06133: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203891.06144: Calling all_plugins_play to load vars for managed-node1
15896 1727203891.06148: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203891.06151: Calling groups_plugins_play to load vars for managed-node1
15896 1727203891.06904: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000da
15896 1727203891.06908: WORKER PROCESS EXITING
15896 1727203891.09357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203891.12861: done with get_vars()
15896 1727203891.13104: done getting variables
15896 1727203891.13166: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:51:31 -0400 (0:00:00.183) 0:00:36.721 *****
15896 1727203891.13204: entering _queue_task() for managed-node1/service
15896 1727203891.13929: worker is 1 (out of 1 available)
15896 1727203891.13942: exiting _queue_task() for managed-node1/service
15896 1727203891.13955: done queuing things up, now waiting for results queue to drain
15896 1727203891.13957: waiting for pending results...
15896 1727203891.14594: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15896 1727203891.15041: in run() - task 028d2410-947f-fb83-b6ad-0000000000db
15896 1727203891.15045: variable 'ansible_search_path' from source: unknown
15896 1727203891.15047: variable 'ansible_search_path' from source: unknown
15896 1727203891.15088: calling self._execute()
15896 1727203891.15306: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203891.15353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203891.15374: variable 'omit' from source: magic vars
15896 1727203891.15920: variable 'ansible_distribution_major_version' from source: facts
15896 1727203891.15937: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203891.16126: variable 'network_provider' from source: set_fact
15896 1727203891.16138: variable 'network_state' from source: role '' defaults
15896 1727203891.16153: Evaluated conditional (network_provider == "nm" or network_state != {}): True
15896 1727203891.16164: variable 'omit' from source: magic vars
15896 1727203891.16230: variable 'omit' from source: magic vars
15896 1727203891.16267: variable 'network_service_name' from source: role '' defaults
15896 1727203891.16345: variable 'network_service_name' from source: role '' defaults
15896 1727203891.16562: variable '__network_provider_setup' from source: role '' defaults
15896 1727203891.16565: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203891.16567: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203891.16569: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203891.16604: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203891.16835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203891.19447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203891.19526: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203891.19568: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203891.19624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203891.19654: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203891.19742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.19779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.19810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.19861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.19884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.19939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.19968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.20002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.20049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.20148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.20316: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15896 1727203891.20448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.20488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.20518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.20565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.20591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.20686: variable 'ansible_python' from source: facts
15896 1727203891.20801: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15896 1727203891.20810: variable '__network_wpa_supplicant_required' from source: role '' defaults
15896 1727203891.20894: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15896 1727203891.21030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.21063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.21096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.21143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.21166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.21222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203891.21351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203891.21354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.21395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203891.21439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203891.21890: variable 'network_connections' from source: task vars
15896 1727203891.21893: variable 'controller_profile' from source: play vars
15896 1727203891.21930: variable 'controller_profile' from source: play vars
15896 1727203891.22080: variable 'controller_device' from source: play vars
15896 1727203891.22083: variable 'controller_device' from source: play vars
15896 1727203891.22323: variable 'port1_profile' from source: play vars
15896 1727203891.22325: variable 'port1_profile' from source: play vars
15896 1727203891.22327: variable 'dhcp_interface1' from source: play vars
15896 1727203891.22546: variable 'dhcp_interface1' from source: play vars
15896 1727203891.22565: variable 'controller_profile' from source: play vars
15896 1727203891.22640: variable 'controller_profile' from source: play vars
15896 1727203891.22766: variable 'port2_profile' from source: play vars
15896 1727203891.22836: variable 'port2_profile' from source: play vars
15896 1727203891.23180: variable 'dhcp_interface2' from source: play vars
15896 1727203891.23184: variable 'dhcp_interface2' from source: play vars
15896 1727203891.23186: variable 'controller_profile' from source: play vars
15896 1727203891.23188: variable 'controller_profile' from source: play vars
15896 1727203891.23448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203891.23807: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203891.23920: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203891.24515: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203891.24594: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203891.24665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203891.24706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203891.24742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203891.24779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203891.24835: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203891.25147: variable 'network_connections' from source: task vars
15896 1727203891.25157: variable 'controller_profile' from source: play vars
15896 1727203891.25246: variable 'controller_profile' from source: play vars
15896 1727203891.25265: variable 'controller_device' from source: play vars
15896 1727203891.25341: variable 'controller_device' from source: play vars
15896 1727203891.25357: variable 'port1_profile' from source: play vars
15896 1727203891.25432: variable 'port1_profile' from source: play vars
15896 1727203891.25450: variable 'dhcp_interface1' from source: play vars
15896 1727203891.25523: variable 'dhcp_interface1' from source: play vars
15896 1727203891.25538: variable 'controller_profile' from source: play vars
15896 1727203891.25612: variable 'controller_profile' from source: play vars
15896 1727203891.25625: variable 'port2_profile' from source: play vars
15896 1727203891.25701: variable 'port2_profile' from source: play vars
15896 1727203891.25716: variable 'dhcp_interface2' from source: play vars
15896 1727203891.25794: variable 'dhcp_interface2' from source: play vars
15896 1727203891.25810: variable 'controller_profile' from source: play vars
15896 1727203891.25889: variable 'controller_profile' from source: play vars
15896 1727203891.25945: variable '__network_packages_default_wireless' from source: role '' defaults
15896 1727203891.26035: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203891.26754: variable 'network_connections' from source: task vars
15896 1727203891.26758: variable 'controller_profile' from source: play vars
15896 1727203891.26829: variable 'controller_profile' from source: play vars
15896 1727203891.26842: variable 'controller_device' from source: play vars
15896 1727203891.26928: variable 'controller_device' from source: play vars
15896 1727203891.27090: variable 'port1_profile' from source: play vars
15896 1727203891.27164: variable 'port1_profile' from source: play vars
15896 1727203891.27383: variable 'dhcp_interface1' from source: play vars
15896 1727203891.27386: variable 'dhcp_interface1' from source: play vars
15896 1727203891.27388: variable 'controller_profile' from source: play vars
15896 1727203891.27541: variable 'controller_profile' from source: play vars
15896 1727203891.27552: variable 'port2_profile' from source: play vars
15896 1727203891.27715: variable 'port2_profile' from source: play vars
15896 1727203891.27727: variable 'dhcp_interface2' from source: play vars
15896 1727203891.27835: variable 'dhcp_interface2' from source: play vars
15896 1727203891.27890: variable 'controller_profile' from source: play vars
15896 1727203891.28068: variable 'controller_profile' from source: play vars
15896 1727203891.27541: variable 'controller_profile' from source: play vars 15896 1727203891.27552: variable 'port2_profile' from source: play vars 15896 1727203891.27715: variable 'port2_profile' from source: play vars 15896 1727203891.27727: variable 'dhcp_interface2' from source: play vars 15896 1727203891.27835: variable 'dhcp_interface2' from source: play vars 15896 1727203891.27890: variable 'controller_profile' from source: play vars 15896 1727203891.28068: variable 'controller_profile' from source: play vars 15896 1727203891.28112: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203891.28310: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203891.28768: variable 'network_connections' from source: task vars 15896 1727203891.28783: variable 'controller_profile' from source: play vars 15896 1727203891.28869: variable 'controller_profile' from source: play vars 15896 1727203891.28889: variable 'controller_device' from source: play vars 15896 1727203891.28973: variable 'controller_device' from source: play vars 15896 1727203891.28991: variable 'port1_profile' from source: play vars 15896 1727203891.29088: variable 'port1_profile' from source: play vars 15896 1727203891.29112: variable 'dhcp_interface1' from source: play vars 15896 1727203891.29391: variable 'dhcp_interface1' from source: play vars 15896 1727203891.29395: variable 'controller_profile' from source: play vars 15896 1727203891.29575: variable 'controller_profile' from source: play vars 15896 1727203891.29678: variable 'port2_profile' from source: play vars 15896 1727203891.29681: variable 'port2_profile' from source: play vars 15896 1727203891.29683: variable 'dhcp_interface2' from source: play vars 15896 1727203891.29771: variable 'dhcp_interface2' from source: play vars 15896 1727203891.29786: variable 'controller_profile' from source: play vars 15896 1727203891.29859: variable 'controller_profile' from source: play vars 
15896 1727203891.29942: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203891.30018: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203891.30029: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203891.30093: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203891.30327: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203891.31007: variable 'network_connections' from source: task vars 15896 1727203891.31011: variable 'controller_profile' from source: play vars 15896 1727203891.31042: variable 'controller_profile' from source: play vars 15896 1727203891.31055: variable 'controller_device' from source: play vars 15896 1727203891.31131: variable 'controller_device' from source: play vars 15896 1727203891.31146: variable 'port1_profile' from source: play vars 15896 1727203891.31218: variable 'port1_profile' from source: play vars 15896 1727203891.31230: variable 'dhcp_interface1' from source: play vars 15896 1727203891.31298: variable 'dhcp_interface1' from source: play vars 15896 1727203891.31312: variable 'controller_profile' from source: play vars 15896 1727203891.31380: variable 'controller_profile' from source: play vars 15896 1727203891.31393: variable 'port2_profile' from source: play vars 15896 1727203891.31459: variable 'port2_profile' from source: play vars 15896 1727203891.31481: variable 'dhcp_interface2' from source: play vars 15896 1727203891.31646: variable 'dhcp_interface2' from source: play vars 15896 1727203891.31649: variable 'controller_profile' from source: play vars 15896 1727203891.31651: variable 'controller_profile' from source: play vars 15896 1727203891.31653: variable 'ansible_distribution' from source: facts 15896 1727203891.31655: variable '__network_rh_distros' from source: role '' defaults 15896 1727203891.31657: 
variable 'ansible_distribution_major_version' from source: facts 15896 1727203891.31678: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203891.31873: variable 'ansible_distribution' from source: facts 15896 1727203891.31885: variable '__network_rh_distros' from source: role '' defaults 15896 1727203891.31894: variable 'ansible_distribution_major_version' from source: facts 15896 1727203891.31912: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203891.32100: variable 'ansible_distribution' from source: facts 15896 1727203891.32109: variable '__network_rh_distros' from source: role '' defaults 15896 1727203891.32118: variable 'ansible_distribution_major_version' from source: facts 15896 1727203891.32163: variable 'network_provider' from source: set_fact 15896 1727203891.32199: variable 'omit' from source: magic vars 15896 1727203891.32233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203891.32271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203891.32302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203891.32324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203891.32338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203891.32373: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203891.32384: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203891.32392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203891.32502: Set connection var ansible_shell_type to sh 15896 
1727203891.32624: Set connection var ansible_connection to ssh 15896 1727203891.32627: Set connection var ansible_shell_executable to /bin/sh 15896 1727203891.32629: Set connection var ansible_pipelining to False 15896 1727203891.32631: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203891.32633: Set connection var ansible_timeout to 10 15896 1727203891.32635: variable 'ansible_shell_executable' from source: unknown 15896 1727203891.32637: variable 'ansible_connection' from source: unknown 15896 1727203891.32639: variable 'ansible_module_compression' from source: unknown 15896 1727203891.32641: variable 'ansible_shell_type' from source: unknown 15896 1727203891.32643: variable 'ansible_shell_executable' from source: unknown 15896 1727203891.32645: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203891.32647: variable 'ansible_pipelining' from source: unknown 15896 1727203891.32648: variable 'ansible_timeout' from source: unknown 15896 1727203891.32650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203891.32734: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203891.32750: variable 'omit' from source: magic vars 15896 1727203891.32763: starting attempt loop 15896 1727203891.32771: running the handler 15896 1727203891.32862: variable 'ansible_facts' from source: unknown 15896 1727203891.33843: _low_level_execute_command(): starting 15896 1727203891.33858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203891.34643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203891.34696: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203891.34713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203891.34725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203891.34795: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203891.34846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203891.34863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.34888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.35003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.36785: stdout chunk (state=3): >>>/root <<< 15896 1727203891.36945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.36948: stdout chunk (state=3): >>><<< 15896 1727203891.36951: stderr chunk (state=3): >>><<< 15896 1727203891.37071: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203891.37084: _low_level_execute_command(): starting 15896 1727203891.37088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143 `" && echo ansible-tmp-1727203891.3697805-18564-87149251414143="` echo /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143 `" ) && sleep 0' 15896 1727203891.37656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.37704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.37786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.39906: stdout chunk (state=3): >>>ansible-tmp-1727203891.3697805-18564-87149251414143=/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143 <<< 15896 1727203891.40078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.40082: stdout chunk (state=3): >>><<< 15896 1727203891.40085: stderr chunk (state=3): >>><<< 15896 1727203891.40101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203891.3697805-18564-87149251414143=/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203891.40142: variable 'ansible_module_compression' from source: unknown 15896 1727203891.40210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15896 1727203891.40482: variable 'ansible_facts' from source: unknown 15896 1727203891.40520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py 15896 1727203891.40724: Sending initial data 15896 1727203891.40734: Sent initial data (155 bytes) 15896 1727203891.41424: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203891.41442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203891.41586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203891.41600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.41642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.41757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.43642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203891.44035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203891.44258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpwwzybk4d /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py <<< 15896 1727203891.44273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py" <<< 15896 1727203891.44421: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpwwzybk4d" to remote "/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py" <<< 15896 1727203891.48021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.48167: stderr chunk (state=3): >>><<< 15896 1727203891.48171: stdout chunk (state=3): >>><<< 15896 1727203891.48173: done transferring module to remote 15896 1727203891.48181: _low_level_execute_command(): starting 15896 1727203891.48184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/ /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py && sleep 0' 15896 1727203891.48749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203891.48766: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203891.48784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203891.48900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203891.48915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.48929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.49047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.51183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.51188: stdout chunk (state=3): >>><<< 15896 1727203891.51191: stderr chunk (state=3): >>><<< 15896 1727203891.51193: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203891.51195: _low_level_execute_command(): starting 15896 1727203891.51198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/AnsiballZ_systemd.py && sleep 0' 15896 1727203891.51872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203891.51925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203891.51937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.51957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.52072: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.83685: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call or<<< 15896 1727203891.83690: stdout chunk (state=3): >>>g.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10600448", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3289059328", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "870063000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", 
"TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Private<<< 15896 1727203891.83697: stdout chunk (state=3): >>>IPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 
14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15896 1727203891.85894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.86018: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 15896 1727203891.86022: stdout chunk (state=3): >>><<< 15896 1727203891.86027: stderr chunk (state=3): >>><<< 15896 1727203891.86047: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 
2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10600448", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3289059328", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "870063000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", 
"StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", 
"InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203891.86444: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203891.86463: _low_level_execute_command(): starting 15896 1727203891.86470: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203891.3697805-18564-87149251414143/ > /dev/null 2>&1 && sleep 0' 15896 1727203891.87813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 
1727203891.87832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203891.87838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203891.87850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203891.87856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203891.87862: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203891.87874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203891.88044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203891.88113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203891.88177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203891.90346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203891.90681: stderr chunk (state=3): >>><<< 15896 1727203891.90685: stdout chunk (state=3): >>><<< 15896 1727203891.90688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203891.90690: handler run complete 15896 1727203891.90692: attempt loop complete, returning result 15896 1727203891.90694: _execute() done 15896 1727203891.90696: dumping result to json 15896 1727203891.90698: done dumping result, returning 15896 1727203891.90700: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-fb83-b6ad-0000000000db] 15896 1727203891.90702: sending task result for task 028d2410-947f-fb83-b6ad-0000000000db 15896 1727203891.91282: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000db 15896 1727203891.91285: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203891.91338: no more pending results, returning what we have 15896 1727203891.91342: results queue empty 15896 1727203891.91342: checking for any_errors_fatal 15896 1727203891.91348: done checking for any_errors_fatal 15896 1727203891.91349: checking for max_fail_percentage 15896 1727203891.91350: done checking for 
max_fail_percentage 15896 1727203891.91351: checking to see if all hosts have failed and the running result is not ok 15896 1727203891.91352: done checking to see if all hosts have failed 15896 1727203891.91353: getting the remaining hosts for this loop 15896 1727203891.91354: done getting the remaining hosts for this loop 15896 1727203891.91357: getting the next task for host managed-node1 15896 1727203891.91363: done getting next task for host managed-node1 15896 1727203891.91367: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203891.91370: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203891.91386: getting variables 15896 1727203891.91388: in VariableManager get_vars() 15896 1727203891.91467: Calling all_inventory to load vars for managed-node1 15896 1727203891.91469: Calling groups_inventory to load vars for managed-node1 15896 1727203891.91472: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203891.91684: Calling all_plugins_play to load vars for managed-node1 15896 1727203891.91688: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203891.91691: Calling groups_plugins_play to load vars for managed-node1 15896 1727203891.95977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203892.00146: done with get_vars() 15896 1727203892.00181: done getting variables 15896 1727203892.00436: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:32 -0400 (0:00:00.872) 0:00:37.594 ***** 15896 1727203892.00470: entering _queue_task() for managed-node1/service 15896 1727203892.01598: worker is 1 (out of 1 available) 15896 1727203892.01608: exiting _queue_task() for managed-node1/service 15896 1727203892.01621: done queuing things up, now waiting for results queue to drain 15896 1727203892.01622: waiting for pending results... 
15896 1727203892.02095: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203892.02447: in run() - task 028d2410-947f-fb83-b6ad-0000000000dc 15896 1727203892.02451: variable 'ansible_search_path' from source: unknown 15896 1727203892.02453: variable 'ansible_search_path' from source: unknown 15896 1727203892.02458: calling self._execute() 15896 1727203892.02648: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.02782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.02801: variable 'omit' from source: magic vars 15896 1727203892.03598: variable 'ansible_distribution_major_version' from source: facts 15896 1727203892.03616: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203892.03734: variable 'network_provider' from source: set_fact 15896 1727203892.03744: Evaluated conditional (network_provider == "nm"): True 15896 1727203892.03837: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203892.03932: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203892.04099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203892.07843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203892.08067: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203892.08112: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203892.08171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203892.08277: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203892.08494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203892.08570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203892.08605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203892.08721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203892.08739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203892.08981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203892.08984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203892.08986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203892.09104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203892.09213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203892.09216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203892.09219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203892.09331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203892.09378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203892.09447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203892.09757: variable 'network_connections' from source: task vars 15896 1727203892.09799: variable 'controller_profile' from source: play vars 15896 1727203892.09937: variable 'controller_profile' from source: play vars 15896 1727203892.09991: variable 'controller_device' from source: play vars 15896 1727203892.10144: variable 'controller_device' from source: play vars 15896 1727203892.10158: variable 'port1_profile' from source: play vars 15896 1727203892.10411: variable 'port1_profile' from source: play vars 15896 
1727203892.10414: variable 'dhcp_interface1' from source: play vars 15896 1727203892.10416: variable 'dhcp_interface1' from source: play vars 15896 1727203892.10418: variable 'controller_profile' from source: play vars 15896 1727203892.10570: variable 'controller_profile' from source: play vars 15896 1727203892.10637: variable 'port2_profile' from source: play vars 15896 1727203892.10702: variable 'port2_profile' from source: play vars 15896 1727203892.10746: variable 'dhcp_interface2' from source: play vars 15896 1727203892.10908: variable 'dhcp_interface2' from source: play vars 15896 1727203892.10920: variable 'controller_profile' from source: play vars 15896 1727203892.11173: variable 'controller_profile' from source: play vars 15896 1727203892.11179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203892.11564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203892.11648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203892.11746: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203892.11781: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203892.11871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203892.11958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203892.11992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 15896 1727203892.12265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203892.12268: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203892.12755: variable 'network_connections' from source: task vars 15896 1727203892.12817: variable 'controller_profile' from source: play vars 15896 1727203892.12981: variable 'controller_profile' from source: play vars 15896 1727203892.12984: variable 'controller_device' from source: play vars 15896 1727203892.13072: variable 'controller_device' from source: play vars 15896 1727203892.13147: variable 'port1_profile' from source: play vars 15896 1727203892.13214: variable 'port1_profile' from source: play vars 15896 1727203892.13362: variable 'dhcp_interface1' from source: play vars 15896 1727203892.13428: variable 'dhcp_interface1' from source: play vars 15896 1727203892.13470: variable 'controller_profile' from source: play vars 15896 1727203892.13569: variable 'controller_profile' from source: play vars 15896 1727203892.13697: variable 'port2_profile' from source: play vars 15896 1727203892.13754: variable 'port2_profile' from source: play vars 15896 1727203892.13770: variable 'dhcp_interface2' from source: play vars 15896 1727203892.13862: variable 'dhcp_interface2' from source: play vars 15896 1727203892.13924: variable 'controller_profile' from source: play vars 15896 1727203892.14062: variable 'controller_profile' from source: play vars 15896 1727203892.14215: Evaluated conditional (__network_wpa_supplicant_required): False 15896 1727203892.14218: when evaluation is False, skipping this task 15896 1727203892.14221: _execute() done 15896 1727203892.14223: dumping result to json 15896 1727203892.14225: done dumping result, returning 15896 1727203892.14232: done running TaskExecutor() for 
managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-fb83-b6ad-0000000000dc] 15896 1727203892.14247: sending task result for task 028d2410-947f-fb83-b6ad-0000000000dc 15896 1727203892.14807: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000dc 15896 1727203892.14810: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15896 1727203892.14857: no more pending results, returning what we have 15896 1727203892.14863: results queue empty 15896 1727203892.14864: checking for any_errors_fatal 15896 1727203892.14879: done checking for any_errors_fatal 15896 1727203892.14880: checking for max_fail_percentage 15896 1727203892.14882: done checking for max_fail_percentage 15896 1727203892.14882: checking to see if all hosts have failed and the running result is not ok 15896 1727203892.14883: done checking to see if all hosts have failed 15896 1727203892.14888: getting the remaining hosts for this loop 15896 1727203892.14891: done getting the remaining hosts for this loop 15896 1727203892.14895: getting the next task for host managed-node1 15896 1727203892.14901: done getting next task for host managed-node1 15896 1727203892.14905: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203892.14908: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15896 1727203892.14927: getting variables 15896 1727203892.14929: in VariableManager get_vars() 15896 1727203892.15194: Calling all_inventory to load vars for managed-node1 15896 1727203892.15197: Calling groups_inventory to load vars for managed-node1 15896 1727203892.15200: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203892.15211: Calling all_plugins_play to load vars for managed-node1 15896 1727203892.15214: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203892.15217: Calling groups_plugins_play to load vars for managed-node1 15896 1727203892.18135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203892.21458: done with get_vars() 15896 1727203892.21611: done getting variables 15896 1727203892.21781: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:51:32 -0400 (0:00:00.213) 0:00:37.807 ***** 15896 1727203892.21815: entering _queue_task() for managed-node1/service 15896 1727203892.22646: worker is 1 (out of 1 available) 15896 1727203892.22662: exiting _queue_task() for managed-node1/service 15896 1727203892.22674: done queuing things up, now waiting for results queue to drain 15896 1727203892.22678: waiting for pending results... 
15896 1727203892.23195: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203892.23525: in run() - task 028d2410-947f-fb83-b6ad-0000000000dd 15896 1727203892.23586: variable 'ansible_search_path' from source: unknown 15896 1727203892.23670: variable 'ansible_search_path' from source: unknown 15896 1727203892.23702: calling self._execute() 15896 1727203892.23985: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.24181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.24185: variable 'omit' from source: magic vars 15896 1727203892.24913: variable 'ansible_distribution_major_version' from source: facts 15896 1727203892.24983: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203892.25221: variable 'network_provider' from source: set_fact 15896 1727203892.25383: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203892.25387: when evaluation is False, skipping this task 15896 1727203892.25389: _execute() done 15896 1727203892.25394: dumping result to json 15896 1727203892.25396: done dumping result, returning 15896 1727203892.25398: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-fb83-b6ad-0000000000dd] 15896 1727203892.25400: sending task result for task 028d2410-947f-fb83-b6ad-0000000000dd 15896 1727203892.25478: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000dd 15896 1727203892.25482: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203892.25533: no more pending results, returning what we have 15896 1727203892.25537: results queue empty 15896 1727203892.25538: checking for any_errors_fatal 15896 1727203892.25546: done checking for 
any_errors_fatal 15896 1727203892.25547: checking for max_fail_percentage 15896 1727203892.25549: done checking for max_fail_percentage 15896 1727203892.25550: checking to see if all hosts have failed and the running result is not ok 15896 1727203892.25551: done checking to see if all hosts have failed 15896 1727203892.25551: getting the remaining hosts for this loop 15896 1727203892.25553: done getting the remaining hosts for this loop 15896 1727203892.25557: getting the next task for host managed-node1 15896 1727203892.25567: done getting next task for host managed-node1 15896 1727203892.25571: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203892.25577: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203892.25604: getting variables 15896 1727203892.25606: in VariableManager get_vars() 15896 1727203892.25667: Calling all_inventory to load vars for managed-node1 15896 1727203892.25670: Calling groups_inventory to load vars for managed-node1 15896 1727203892.25672: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203892.25907: Calling all_plugins_play to load vars for managed-node1 15896 1727203892.25912: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203892.25915: Calling groups_plugins_play to load vars for managed-node1 15896 1727203892.29307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203892.32715: done with get_vars() 15896 1727203892.32749: done getting variables 15896 1727203892.32927: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:32 -0400 (0:00:00.111) 0:00:37.918 ***** 15896 1727203892.32967: entering _queue_task() for managed-node1/copy 15896 1727203892.34006: worker is 1 (out of 1 available) 15896 1727203892.34019: exiting _queue_task() for managed-node1/copy 15896 1727203892.34033: done queuing things up, now waiting for results queue to drain 15896 1727203892.34035: waiting for pending results... 
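Throughout this log, `VariableManager` annotates every variable lookup with its origin ("from source: role '' defaults", "play vars", "task vars", "set_fact", "facts"). Those sources form a precedence chain: higher-precedence layers shadow lower ones, which is why `network_provider` from `set_fact` wins over any role default. A rough sketch of that layered lookup using `ChainMap` (the layer ordering follows Ansible's documented precedence, but the specific variables and values are invented examples):

```python
# Hypothetical sketch of layered variable precedence, echoing the
# "variable '...' from source: ..." lines in this log. Values are invented.

from collections import ChainMap

# Lowest precedence: role defaults; highest: set_fact / registered vars.
role_defaults = {"network_provider": "initscripts", "timeout": 10}
play_vars     = {"controller_profile": "bond0"}
task_vars     = {"network_provider": "nm"}  # shadows the role default
set_fact_vars = {}                          # would shadow task vars if set

# ChainMap searches maps left to right, so list highest precedence first.
effective = ChainMap(set_fact_vars, task_vars, play_vars, role_defaults)

print(effective["network_provider"])    # "nm"    (task vars win)
print(effective["controller_profile"])  # "bond0" (falls through to play vars)
print(effective["timeout"])             # 10      (falls through to defaults)
```

Ansible's real precedence list has many more layers (extra vars, block vars, include params, and so on), but the shadowing mechanic is the same.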
15896 1727203892.34943: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203892.34971: in run() - task 028d2410-947f-fb83-b6ad-0000000000de 15896 1727203892.35149: variable 'ansible_search_path' from source: unknown 15896 1727203892.35153: variable 'ansible_search_path' from source: unknown 15896 1727203892.35155: calling self._execute() 15896 1727203892.35370: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.35387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.35585: variable 'omit' from source: magic vars 15896 1727203892.36297: variable 'ansible_distribution_major_version' from source: facts 15896 1727203892.36363: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203892.36604: variable 'network_provider' from source: set_fact 15896 1727203892.36617: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203892.36783: when evaluation is False, skipping this task 15896 1727203892.36786: _execute() done 15896 1727203892.36788: dumping result to json 15896 1727203892.36789: done dumping result, returning 15896 1727203892.36793: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-fb83-b6ad-0000000000de] 15896 1727203892.36795: sending task result for task 028d2410-947f-fb83-b6ad-0000000000de 15896 1727203892.36858: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000de 15896 1727203892.36863: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203892.36932: no more pending results, returning what we have 15896 1727203892.36937: results queue empty 15896 1727203892.36938: checking for 
any_errors_fatal 15896 1727203892.36946: done checking for any_errors_fatal 15896 1727203892.36946: checking for max_fail_percentage 15896 1727203892.36949: done checking for max_fail_percentage 15896 1727203892.36950: checking to see if all hosts have failed and the running result is not ok 15896 1727203892.36950: done checking to see if all hosts have failed 15896 1727203892.36951: getting the remaining hosts for this loop 15896 1727203892.36953: done getting the remaining hosts for this loop 15896 1727203892.36956: getting the next task for host managed-node1 15896 1727203892.36965: done getting next task for host managed-node1 15896 1727203892.36969: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203892.36973: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203892.36999: getting variables 15896 1727203892.37002: in VariableManager get_vars() 15896 1727203892.37064: Calling all_inventory to load vars for managed-node1 15896 1727203892.37067: Calling groups_inventory to load vars for managed-node1 15896 1727203892.37069: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203892.37387: Calling all_plugins_play to load vars for managed-node1 15896 1727203892.37392: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203892.37395: Calling groups_plugins_play to load vars for managed-node1 15896 1727203892.40403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203892.44035: done with get_vars() 15896 1727203892.44068: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:51:32 -0400 (0:00:00.113) 0:00:38.032 ***** 15896 1727203892.44346: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203892.45278: worker is 1 (out of 1 available) 15896 1727203892.45290: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203892.45302: done queuing things up, now waiting for results queue to drain 15896 1727203892.45304: waiting for pending results... 
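The `network_connections` task queued here is the first in this excerpt that actually executes a module, so the low-level SSH machinery appears below: Ansible first creates a private remote working directory named `ansible-tmp-<timestamp>-<pid>-<random>` (via the `umask 77 && mkdir -p ...` shell command in the log), then transfers the AnsiballZ payload into it. A local sketch of that directory-creation step (illustrative only; the base path and naming scheme below approximate, but are not, Ansible's exact implementation):

```python
# Hypothetical sketch of the 'ansible-tmp-...' remote temp-dir creation
# visible in the _low_level_execute_command() calls below. Illustrative,
# not Ansible's actual code.

import os
import random
import tempfile
import time


def make_ansible_tmp(base=None):
    """Create an owner-only (0700) working dir, like 'umask 77 && mkdir'."""
    base = base or os.path.join(tempfile.gettempdir(), ".ansible", "tmp")
    os.makedirs(base, mode=0o700, exist_ok=True)
    # Name combines a timestamp, the controller PID, and a random suffix
    # so concurrent tasks never collide.
    name = "ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    path = os.path.join(base, name)
    os.mkdir(path, 0o700)  # owner-only, the effect of umask 77
    return path


tmp = make_ansible_tmp()
print(tmp)
```

On the real managed node this runs under `/root/.ansible/tmp`, and the module payload (`AnsiballZ_network_connections.py`) is uploaded into the directory over the multiplexed SSH session before being executed with the remote Python.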
15896 1727203892.45781: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203892.46202: in run() - task 028d2410-947f-fb83-b6ad-0000000000df 15896 1727203892.46207: variable 'ansible_search_path' from source: unknown 15896 1727203892.46209: variable 'ansible_search_path' from source: unknown 15896 1727203892.46212: calling self._execute() 15896 1727203892.46378: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.46428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.46445: variable 'omit' from source: magic vars 15896 1727203892.47258: variable 'ansible_distribution_major_version' from source: facts 15896 1727203892.47306: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203892.47483: variable 'omit' from source: magic vars 15896 1727203892.47486: variable 'omit' from source: magic vars 15896 1727203892.47818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203892.65987: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203892.66202: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203892.66206: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203892.66208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203892.66380: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203892.66438: variable 'network_provider' from source: set_fact 15896 1727203892.66683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203892.66780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203892.66811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203892.66900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203892.66985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203892.67140: variable 'omit' from source: magic vars 15896 1727203892.67373: variable 'omit' from source: magic vars 15896 1727203892.67639: variable 'network_connections' from source: task vars 15896 1727203892.67654: variable 'controller_profile' from source: play vars 15896 1727203892.67780: variable 'controller_profile' from source: play vars 15896 1727203892.67795: variable 'controller_device' from source: play vars 15896 1727203892.67895: variable 'controller_device' from source: play vars 15896 1727203892.67955: variable 'port1_profile' from source: play vars 15896 1727203892.68109: variable 'port1_profile' from source: play vars 15896 1727203892.68119: variable 'dhcp_interface1' from source: play vars 15896 1727203892.68380: variable 'dhcp_interface1' from source: play vars 15896 1727203892.68383: variable 'controller_profile' from source: play vars 15896 1727203892.68385: variable 'controller_profile' from source: play vars 15896 1727203892.68388: 
variable 'port2_profile' from source: play vars 15896 1727203892.68520: variable 'port2_profile' from source: play vars 15896 1727203892.68530: variable 'dhcp_interface2' from source: play vars 15896 1727203892.68703: variable 'dhcp_interface2' from source: play vars 15896 1727203892.68780: variable 'controller_profile' from source: play vars 15896 1727203892.68783: variable 'controller_profile' from source: play vars 15896 1727203892.69134: variable 'omit' from source: magic vars 15896 1727203892.69151: variable '__lsr_ansible_managed' from source: task vars 15896 1727203892.69213: variable '__lsr_ansible_managed' from source: task vars 15896 1727203892.69562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15896 1727203892.70217: Loaded config def from plugin (lookup/template) 15896 1727203892.70220: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15896 1727203892.70224: File lookup term: get_ansible_managed.j2 15896 1727203892.70226: variable 'ansible_search_path' from source: unknown 15896 1727203892.70228: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15896 1727203892.70231: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15896 1727203892.70234: variable 'ansible_search_path' from source: unknown 15896 1727203892.80281: variable 'ansible_managed' from source: unknown 15896 1727203892.80432: variable 'omit' from source: magic vars 15896 1727203892.80469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203892.80525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203892.80528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203892.80541: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203892.80555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203892.80588: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203892.80597: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.80632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.80709: Set connection var ansible_shell_type to sh 15896 1727203892.80721: Set connection var ansible_connection to ssh 15896 1727203892.80731: Set connection var ansible_shell_executable to /bin/sh 15896 1727203892.80747: Set connection var ansible_pipelining to False 15896 1727203892.80782: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203892.80792: Set connection var ansible_timeout to 10 15896 1727203892.80807: 
variable 'ansible_shell_executable' from source: unknown 15896 1727203892.80850: variable 'ansible_connection' from source: unknown 15896 1727203892.80853: variable 'ansible_module_compression' from source: unknown 15896 1727203892.80856: variable 'ansible_shell_type' from source: unknown 15896 1727203892.80858: variable 'ansible_shell_executable' from source: unknown 15896 1727203892.80863: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203892.80865: variable 'ansible_pipelining' from source: unknown 15896 1727203892.80867: variable 'ansible_timeout' from source: unknown 15896 1727203892.80869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203892.80990: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203892.81008: variable 'omit' from source: magic vars 15896 1727203892.81069: starting attempt loop 15896 1727203892.81072: running the handler 15896 1727203892.81076: _low_level_execute_command(): starting 15896 1727203892.81079: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203892.81845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203892.81933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203892.81977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203892.82158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203892.84337: stdout chunk (state=3): >>>/root <<< 15896 1727203892.84340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203892.84343: stdout chunk (state=3): >>><<< 15896 1727203892.84345: stderr chunk (state=3): >>><<< 15896 1727203892.84348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203892.84350: _low_level_execute_command(): starting 15896 1727203892.84353: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478 `" && echo ansible-tmp-1727203892.8430178-18651-127100256094478="` echo /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478 `" ) && sleep 0' 15896 1727203892.85453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203892.85457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203892.85468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203892.85484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203892.85502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203892.85508: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203892.85518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203892.85607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203892.85727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203892.85869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203892.88370: stdout chunk (state=3): >>>ansible-tmp-1727203892.8430178-18651-127100256094478=/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478 <<< 15896 1727203892.88433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203892.88437: stdout chunk (state=3): >>><<< 15896 1727203892.88442: stderr chunk (state=3): >>><<< 15896 1727203892.88467: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203892.8430178-18651-127100256094478=/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203892.88508: variable 'ansible_module_compression' from source: unknown 15896 1727203892.88548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15896 1727203892.88583: variable 'ansible_facts' from source: unknown 15896 1727203892.89089: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py 15896 1727203892.89548: Sending initial data 15896 1727203892.89551: Sent initial data (168 bytes) 15896 1727203892.90486: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203892.90882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203892.91120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203892.91206: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203892.92945: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203892.93020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203892.93094: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpna0g_w9e /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py <<< 15896 1727203892.93098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py" <<< 15896 1727203892.93171: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpna0g_w9e" to remote "/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py" <<< 15896 1727203892.95842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203892.95880: stderr chunk (state=3): >>><<< 
15896 1727203892.95888: stdout chunk (state=3): >>><<< 15896 1727203892.95934: done transferring module to remote 15896 1727203892.95946: _low_level_execute_command(): starting 15896 1727203892.95951: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/ /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py && sleep 0' 15896 1727203892.97330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203892.97540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203892.97547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203892.97550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203892.97552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203892.97554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203892.97591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203892.97682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15896 1727203892.99704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203892.99708: stdout chunk (state=3): >>><<< 15896 1727203892.99712: stderr chunk (state=3): >>><<< 15896 1727203892.99731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203892.99734: _low_level_execute_command(): starting 15896 1727203892.99739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/AnsiballZ_network_connections.py && sleep 0' 15896 1727203893.02080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203893.02084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203893.02086: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203893.02089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203893.02254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203893.02358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203893.02695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203893.58527: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up 
persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15896 1727203893.61004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203893.61009: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
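For reference, the module_args captured in the result above correspond to a role configuration along these lines. This is a reconstruction from the logged invocation, not the actual playbook used in this run (which is not shown); the `network_connections` variable name follows the fedora.linux_system_roles.network role's documented convention:

```yaml
# Sketch reconstructed from the module_args logged above; the real playbook
# for this run is not visible in the log, so treat the surrounding play
# structure (hosts, role include) as an assumption.
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0
            state: up
            type: bond
            interface_name: nm-bond
            bond:
              mode: active-backup   # failover bonding, one active port
              miimon: 110           # MII link monitoring interval in ms
            ip:
              route_metric4: 65535  # deprioritize bond's IPv4 routes
          - name: bond0.0
            state: up
            type: ethernet
            interface_name: test1
            controller: bond0       # port of bond0
          - name: bond0.1
            state: up
            type: ethernet
            interface_name: test2
            controller: bond0
```

The `stderr` lines in the result ("add connection", "update connection", "up connection … (is-modified)/(not-active)") are the role's per-connection action trace for exactly these three profiles.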
<<< 15896 1727203893.61074: stderr chunk (state=3): >>><<< 15896 1727203893.61293: stdout chunk (state=3): >>><<< 15896 1727203893.61316: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203893.61380: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203893.61388: _low_level_execute_command(): starting 15896 1727203893.61394: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203892.8430178-18651-127100256094478/ > /dev/null 2>&1 && sleep 0' 15896 1727203893.62434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203893.62785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203893.62802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203893.62844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203893.62922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203893.64963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203893.65091: stderr chunk (state=3): >>><<< 15896 1727203893.65192: stdout chunk (state=3): >>><<< 15896 1727203893.65217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203893.65281: handler run complete 15896 1727203893.65285: attempt loop complete, returning result 15896 1727203893.65287: _execute() done 15896 1727203893.65289: dumping result to json 15896 1727203893.65291: done dumping result, returning 15896 1727203893.65293: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-fb83-b6ad-0000000000df] 15896 1727203893.65294: sending task result for task 028d2410-947f-fb83-b6ad-0000000000df 15896 1727203893.65440: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000df 15896 1727203893.65443: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 [008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c [009] #2, state:up persistent_state:present, 
'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active) 15896 1727203893.65597: no more pending results, returning what we have 15896 1727203893.65600: results queue empty 15896 1727203893.65601: checking for any_errors_fatal 15896 1727203893.65606: done checking for any_errors_fatal 15896 1727203893.65607: checking for max_fail_percentage 15896 1727203893.65608: done checking for max_fail_percentage 15896 1727203893.65609: checking to see if all hosts have failed and the running result is not ok 15896 1727203893.65610: done checking to see if all hosts have failed 15896 1727203893.65611: getting the remaining hosts for this loop 15896 1727203893.65612: done getting the remaining hosts for this loop 15896 1727203893.65615: getting the next task for host managed-node1 15896 1727203893.65621: done getting next task for host managed-node1 15896 1727203893.65625: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203893.65628: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203893.65641: getting variables 15896 1727203893.65643: in VariableManager get_vars() 15896 1727203893.66062: Calling all_inventory to load vars for managed-node1 15896 1727203893.66065: Calling groups_inventory to load vars for managed-node1 15896 1727203893.66068: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203893.66083: Calling all_plugins_play to load vars for managed-node1 15896 1727203893.66087: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203893.66091: Calling groups_plugins_play to load vars for managed-node1 15896 1727203893.79196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203893.81055: done with get_vars() 15896 1727203893.81141: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:51:33 -0400 (0:00:01.369) 0:00:39.402 ***** 15896 1727203893.81330: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203893.82067: worker is 1 (out of 1 available) 15896 1727203893.82341: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203893.82355: done queuing things up, now waiting for results queue to drain 15896 1727203893.82356: waiting for pending results... 
15896 1727203893.83009: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203893.83106: in run() - task 028d2410-947f-fb83-b6ad-0000000000e0 15896 1727203893.83110: variable 'ansible_search_path' from source: unknown 15896 1727203893.83114: variable 'ansible_search_path' from source: unknown 15896 1727203893.83163: calling self._execute() 15896 1727203893.83289: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.83547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.83551: variable 'omit' from source: magic vars 15896 1727203893.84086: variable 'ansible_distribution_major_version' from source: facts 15896 1727203893.84090: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203893.84094: variable 'network_state' from source: role '' defaults 15896 1727203893.84097: Evaluated conditional (network_state != {}): False 15896 1727203893.84100: when evaluation is False, skipping this task 15896 1727203893.84103: _execute() done 15896 1727203893.84105: dumping result to json 15896 1727203893.84107: done dumping result, returning 15896 1727203893.84110: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-fb83-b6ad-0000000000e0] 15896 1727203893.84113: sending task result for task 028d2410-947f-fb83-b6ad-0000000000e0 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203893.84291: no more pending results, returning what we have 15896 1727203893.84296: results queue empty 15896 1727203893.84297: checking for any_errors_fatal 15896 1727203893.84313: done checking for any_errors_fatal 15896 1727203893.84314: checking for max_fail_percentage 15896 1727203893.84316: done checking for max_fail_percentage 15896 1727203893.84316: 
checking to see if all hosts have failed and the running result is not ok 15896 1727203893.84317: done checking to see if all hosts have failed 15896 1727203893.84318: getting the remaining hosts for this loop 15896 1727203893.84319: done getting the remaining hosts for this loop 15896 1727203893.84322: getting the next task for host managed-node1 15896 1727203893.84328: done getting next task for host managed-node1 15896 1727203893.84332: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203893.84335: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203893.84358: getting variables 15896 1727203893.84359: in VariableManager get_vars() 15896 1727203893.84407: Calling all_inventory to load vars for managed-node1 15896 1727203893.84410: Calling groups_inventory to load vars for managed-node1 15896 1727203893.84412: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203893.84420: Calling all_plugins_play to load vars for managed-node1 15896 1727203893.84423: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203893.84425: Calling groups_plugins_play to load vars for managed-node1 15896 1727203893.85030: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000e0 15896 1727203893.85034: WORKER PROCESS EXITING 15896 1727203893.87581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203893.89616: done with get_vars() 15896 1727203893.89651: done getting variables 15896 1727203893.89714: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:51:33 -0400 (0:00:00.084) 0:00:39.486 ***** 15896 1727203893.89761: entering _queue_task() for managed-node1/debug 15896 1727203893.90511: worker is 1 (out of 1 available) 15896 1727203893.90520: exiting _queue_task() for managed-node1/debug 15896 1727203893.90535: done queuing things up, now waiting for results queue to drain 15896 1727203893.90536: waiting for pending results... 
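The "Configure networking state" task above was skipped because the role's `network_state` variable defaults to `{}` and the run only sets `network_connections` (the log shows `Evaluated conditional (network_state != {}): False`). As an illustration only, a non-empty `network_state` uses an nmstate-style schema; this particular content is hypothetical and was not part of this run:

```yaml
# Hypothetical example of a network_state value that would make the
# "Configure networking state" task run instead of being skipped.
# Schema follows nmstate conventions; not taken from this log.
network_state:
  interfaces:
    - name: nm-bond
      type: bond
      state: up
```

In this run the NM provider path (`network_connections`) did all the work, so the skip is expected.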
15896 1727203893.90708: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203893.90894: in run() - task 028d2410-947f-fb83-b6ad-0000000000e1 15896 1727203893.90986: variable 'ansible_search_path' from source: unknown 15896 1727203893.90990: variable 'ansible_search_path' from source: unknown 15896 1727203893.90994: calling self._execute() 15896 1727203893.91100: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.91114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.91181: variable 'omit' from source: magic vars 15896 1727203893.91607: variable 'ansible_distribution_major_version' from source: facts 15896 1727203893.91624: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203893.91645: variable 'omit' from source: magic vars 15896 1727203893.91768: variable 'omit' from source: magic vars 15896 1727203893.91810: variable 'omit' from source: magic vars 15896 1727203893.91900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203893.91905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203893.91922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203893.91935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203893.91945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203893.92001: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203893.92005: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.92007: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203893.92079: Set connection var ansible_shell_type to sh 15896 1727203893.92086: Set connection var ansible_connection to ssh 15896 1727203893.92091: Set connection var ansible_shell_executable to /bin/sh 15896 1727203893.92096: Set connection var ansible_pipelining to False 15896 1727203893.92102: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203893.92106: Set connection var ansible_timeout to 10 15896 1727203893.92122: variable 'ansible_shell_executable' from source: unknown 15896 1727203893.92127: variable 'ansible_connection' from source: unknown 15896 1727203893.92130: variable 'ansible_module_compression' from source: unknown 15896 1727203893.92132: variable 'ansible_shell_type' from source: unknown 15896 1727203893.92134: variable 'ansible_shell_executable' from source: unknown 15896 1727203893.92138: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.92141: variable 'ansible_pipelining' from source: unknown 15896 1727203893.92143: variable 'ansible_timeout' from source: unknown 15896 1727203893.92148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.92246: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203893.92265: variable 'omit' from source: magic vars 15896 1727203893.92269: starting attempt loop 15896 1727203893.92271: running the handler 15896 1727203893.92365: variable '__network_connections_result' from source: set_fact 15896 1727203893.92425: handler run complete 15896 1727203893.92438: attempt loop complete, returning result 15896 1727203893.92441: _execute() done 15896 1727203893.92444: dumping result to json 15896 1727203893.92446: 
done dumping result, returning 15896 1727203893.92455: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-fb83-b6ad-0000000000e1] 15896 1727203893.92458: sending task result for task 028d2410-947f-fb83-b6ad-0000000000e1 15896 1727203893.92542: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000e1 15896 1727203893.92545: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)" ] } 15896 1727203893.92620: no more pending results, returning what we have 15896 1727203893.92623: results queue empty 15896 1727203893.92624: checking for any_errors_fatal 15896 1727203893.92627: done checking for any_errors_fatal 15896 1727203893.92628: checking for max_fail_percentage 15896 1727203893.92630: done checking for max_fail_percentage 15896 1727203893.92630: checking to see if all hosts have failed and the running result is not ok 15896 1727203893.92631: done checking to see if all hosts have failed 15896 1727203893.92631: getting the remaining hosts for this loop 15896 1727203893.92633: done getting the remaining hosts for this loop 15896 1727203893.92636: getting the next task for 
host managed-node1 15896 1727203893.92642: done getting next task for host managed-node1 15896 1727203893.92645: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203893.92648: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203893.92664: getting variables 15896 1727203893.92666: in VariableManager get_vars() 15896 1727203893.92713: Calling all_inventory to load vars for managed-node1 15896 1727203893.92716: Calling groups_inventory to load vars for managed-node1 15896 1727203893.92718: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203893.92727: Calling all_plugins_play to load vars for managed-node1 15896 1727203893.92729: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203893.92732: Calling groups_plugins_play to load vars for managed-node1 15896 1727203893.93520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203893.94451: done with get_vars() 15896 1727203893.94472: done getting variables 15896 1727203893.94532: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : 
Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:51:33 -0400 (0:00:00.048) 0:00:39.534 ***** 15896 1727203893.94567: entering _queue_task() for managed-node1/debug 15896 1727203893.95102: worker is 1 (out of 1 available) 15896 1727203893.95111: exiting _queue_task() for managed-node1/debug 15896 1727203893.95119: done queuing things up, now waiting for results queue to drain 15896 1727203893.95121: waiting for pending results... 15896 1727203893.95313: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203893.95318: in run() - task 028d2410-947f-fb83-b6ad-0000000000e2 15896 1727203893.95321: variable 'ansible_search_path' from source: unknown 15896 1727203893.95324: variable 'ansible_search_path' from source: unknown 15896 1727203893.95374: calling self._execute() 15896 1727203893.95470: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.95474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.95486: variable 'omit' from source: magic vars 15896 1727203893.95771: variable 'ansible_distribution_major_version' from source: facts 15896 1727203893.95782: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203893.95789: variable 'omit' from source: magic vars 15896 1727203893.95830: variable 'omit' from source: magic vars 15896 1727203893.95856: variable 'omit' from source: magic vars 15896 1727203893.95892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203893.95918: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203893.95937: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203893.95950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203893.95963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203893.95985: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203893.95988: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.95993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.96062: Set connection var ansible_shell_type to sh 15896 1727203893.96066: Set connection var ansible_connection to ssh 15896 1727203893.96071: Set connection var ansible_shell_executable to /bin/sh 15896 1727203893.96077: Set connection var ansible_pipelining to False 15896 1727203893.96082: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203893.96087: Set connection var ansible_timeout to 10 15896 1727203893.96104: variable 'ansible_shell_executable' from source: unknown 15896 1727203893.96107: variable 'ansible_connection' from source: unknown 15896 1727203893.96110: variable 'ansible_module_compression' from source: unknown 15896 1727203893.96112: variable 'ansible_shell_type' from source: unknown 15896 1727203893.96114: variable 'ansible_shell_executable' from source: unknown 15896 1727203893.96116: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.96118: variable 'ansible_pipelining' from source: unknown 15896 1727203893.96120: variable 'ansible_timeout' from source: unknown 15896 1727203893.96124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.96224: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203893.96233: variable 'omit' from source: magic vars 15896 1727203893.96238: starting attempt loop 15896 1727203893.96241: running the handler 15896 1727203893.96284: variable '__network_connections_result' from source: set_fact 15896 1727203893.96337: variable '__network_connections_result' from source: set_fact 15896 1727203893.96450: handler run complete 15896 1727203893.96481: attempt loop complete, returning result 15896 1727203893.96484: _execute() done 15896 1727203893.96487: dumping result to json 15896 1727203893.96489: done dumping result, returning 15896 1727203893.96492: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-fb83-b6ad-0000000000e2] 15896 1727203893.96494: sending task result for task 028d2410-947f-fb83-b6ad-0000000000e2 15896 1727203893.96589: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000e2 15896 1727203893.96592: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
8404d61d-4c80-4763-affe-7d26fa7e8dd3\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 8404d61d-4c80-4763-affe-7d26fa7e8dd3 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, b00f4d37-e3cf-44e8-8e62-a68de6442d0c (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c5333c4a-95d0-44cb-b04e-077f82270820 (not-active)" ] } } 15896 1727203893.96689: no more pending results, returning what we have 15896 1727203893.96692: results queue empty 15896 1727203893.96698: checking for any_errors_fatal 15896 1727203893.96704: done checking for any_errors_fatal 15896 1727203893.96704: checking for max_fail_percentage 15896 1727203893.96706: done checking for max_fail_percentage 15896 1727203893.96707: checking to see if all hosts have failed and the running result is not ok 15896 1727203893.96707: done checking to see if all hosts have failed 15896 1727203893.96708: getting the 
remaining hosts for this loop 15896 1727203893.96709: done getting the remaining hosts for this loop 15896 1727203893.96712: getting the next task for host managed-node1 15896 1727203893.96717: done getting next task for host managed-node1 15896 1727203893.96720: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203893.96723: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203893.96733: getting variables 15896 1727203893.96734: in VariableManager get_vars() 15896 1727203893.96786: Calling all_inventory to load vars for managed-node1 15896 1727203893.96788: Calling groups_inventory to load vars for managed-node1 15896 1727203893.96791: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203893.96799: Calling all_plugins_play to load vars for managed-node1 15896 1727203893.96801: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203893.96803: Calling groups_plugins_play to load vars for managed-node1 15896 1727203893.97683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203893.98730: done with get_vars() 15896 1727203893.98750: done getting variables 15896 1727203893.98803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:51:33 -0400 (0:00:00.042) 0:00:39.577 ***** 15896 1727203893.98837: entering _queue_task() for managed-node1/debug 15896 1727203893.99154: worker is 1 (out of 1 available) 15896 1727203893.99166: exiting _queue_task() for managed-node1/debug 15896 1727203893.99180: done queuing things up, now waiting for results queue to drain 15896 1727203893.99181: waiting for pending results... 15896 1727203893.99502: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203893.99606: in run() - task 028d2410-947f-fb83-b6ad-0000000000e3 15896 1727203893.99620: variable 'ansible_search_path' from source: unknown 15896 1727203893.99623: variable 'ansible_search_path' from source: unknown 15896 1727203893.99655: calling self._execute() 15896 1727203893.99755: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203893.99762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203893.99772: variable 'omit' from source: magic vars 15896 1727203894.00142: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.00146: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.00227: variable 'network_state' from source: role '' defaults 15896 1727203894.00236: Evaluated conditional (network_state != {}): False 15896 1727203894.00247: when evaluation is False, skipping this task 15896 1727203894.00251: _execute() done 15896 1727203894.00255: dumping result to json 15896 1727203894.00257: done 
dumping result, returning 15896 1727203894.00263: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-fb83-b6ad-0000000000e3] 15896 1727203894.00265: sending task result for task 028d2410-947f-fb83-b6ad-0000000000e3 15896 1727203894.00345: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000e3 15896 1727203894.00374: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 15896 1727203894.00426: no more pending results, returning what we have 15896 1727203894.00429: results queue empty 15896 1727203894.00430: checking for any_errors_fatal 15896 1727203894.00439: done checking for any_errors_fatal 15896 1727203894.00440: checking for max_fail_percentage 15896 1727203894.00441: done checking for max_fail_percentage 15896 1727203894.00442: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.00443: done checking to see if all hosts have failed 15896 1727203894.00443: getting the remaining hosts for this loop 15896 1727203894.00445: done getting the remaining hosts for this loop 15896 1727203894.00448: getting the next task for host managed-node1 15896 1727203894.00453: done getting next task for host managed-node1 15896 1727203894.00457: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203894.00462: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15896 1727203894.00489: getting variables 15896 1727203894.00491: in VariableManager get_vars() 15896 1727203894.00533: Calling all_inventory to load vars for managed-node1 15896 1727203894.00536: Calling groups_inventory to load vars for managed-node1 15896 1727203894.00538: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.00546: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.00548: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.00551: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.01305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.02184: done with get_vars() 15896 1727203894.02199: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.034) 0:00:39.612 ***** 15896 1727203894.02268: entering _queue_task() for managed-node1/ping 15896 1727203894.02567: worker is 1 (out of 1 available) 15896 1727203894.02781: exiting _queue_task() for managed-node1/ping 15896 1727203894.02791: done queuing things up, now waiting for results queue to drain 15896 1727203894.02793: waiting for pending results... 
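[Editor's note: the `module_args` dumped in the task result above imply role input roughly like the following. This is a hedged reconstruction from the log, not the actual playbook source; variable layout follows the documented `network_connections` schema of `fedora.linux_system_roles.network`.]

```yaml
# Sketch of the network_connections input that would produce the
# debug output above (values taken from the logged module_args).
network_connections:
  - name: bond0
    type: bond
    interface_name: nm-bond
    state: up
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    type: ethernet
    interface_name: test1
    controller: bond0
    state: up
  - name: bond0.1
    type: ethernet
    interface_name: test2
    controller: bond0
    state: up
```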
15896 1727203894.02875: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203894.03113: in run() - task 028d2410-947f-fb83-b6ad-0000000000e4 15896 1727203894.03117: variable 'ansible_search_path' from source: unknown 15896 1727203894.03120: variable 'ansible_search_path' from source: unknown 15896 1727203894.03124: calling self._execute() 15896 1727203894.03169: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.03174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.03186: variable 'omit' from source: magic vars 15896 1727203894.03570: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.03583: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.03590: variable 'omit' from source: magic vars 15896 1727203894.03679: variable 'omit' from source: magic vars 15896 1727203894.03770: variable 'omit' from source: magic vars 15896 1727203894.03773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203894.03778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203894.03788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203894.03812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203894.03830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203894.03852: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203894.03855: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.03858: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203894.03983: Set connection var ansible_shell_type to sh 15896 1727203894.03986: Set connection var ansible_connection to ssh 15896 1727203894.03989: Set connection var ansible_shell_executable to /bin/sh 15896 1727203894.03991: Set connection var ansible_pipelining to False 15896 1727203894.03993: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203894.03997: Set connection var ansible_timeout to 10 15896 1727203894.03999: variable 'ansible_shell_executable' from source: unknown 15896 1727203894.04001: variable 'ansible_connection' from source: unknown 15896 1727203894.04004: variable 'ansible_module_compression' from source: unknown 15896 1727203894.04006: variable 'ansible_shell_type' from source: unknown 15896 1727203894.04007: variable 'ansible_shell_executable' from source: unknown 15896 1727203894.04011: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.04013: variable 'ansible_pipelining' from source: unknown 15896 1727203894.04015: variable 'ansible_timeout' from source: unknown 15896 1727203894.04017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.04185: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203894.04204: variable 'omit' from source: magic vars 15896 1727203894.04208: starting attempt loop 15896 1727203894.04210: running the handler 15896 1727203894.04299: _low_level_execute_command(): starting 15896 1727203894.04308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203894.04975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203894.04994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 
1727203894.05010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203894.05047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203894.05099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203894.05113: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203894.05127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.05188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.05241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203894.05258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203894.05291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203894.05400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.07191: stdout chunk (state=3): >>>/root <<< 15896 1727203894.07289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203894.07323: stderr chunk (state=3): >>><<< 15896 1727203894.07326: stdout chunk (state=3): >>><<< 15896 1727203894.07345: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203894.07357: _low_level_execute_command(): starting 15896 1727203894.07366: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281 `" && echo ansible-tmp-1727203894.0734477-18708-89532393551281="` echo /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281 `" ) && sleep 0' 15896 1727203894.07793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203894.07797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.07817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.07874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203894.07882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203894.07982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.10070: stdout chunk (state=3): >>>ansible-tmp-1727203894.0734477-18708-89532393551281=/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281 <<< 15896 1727203894.10181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203894.10204: stderr chunk (state=3): >>><<< 15896 1727203894.10208: stdout chunk (state=3): >>><<< 15896 1727203894.10224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203894.0734477-18708-89532393551281=/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203894.10263: variable 'ansible_module_compression' from source: unknown 15896 1727203894.10307: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15896 1727203894.10337: variable 'ansible_facts' from source: unknown 15896 1727203894.10395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py 15896 1727203894.10497: Sending initial data 15896 1727203894.10501: Sent initial data (152 bytes) 15896 1727203894.10961: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203894.10965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203894.10967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.10970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203894.10971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.11016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203894.11019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203894.11025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203894.11103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.12844: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203894.12945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203894.13025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpu6a8_y_0 /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py <<< 15896 1727203894.13029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py" <<< 15896 1727203894.13110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpu6a8_y_0" to remote "/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py" <<< 15896 1727203894.14867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203894.14873: stdout chunk (state=3): >>><<< 15896 1727203894.14877: stderr chunk (state=3): >>><<< 15896 1727203894.14919: done transferring module to remote 15896 1727203894.14936: _low_level_execute_command(): starting 15896 1727203894.14940: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/ /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py && sleep 0' 15896 1727203894.15906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.15923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203894.15936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203894.15964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203894.16409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.18270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203894.18300: stderr chunk (state=3): >>><<< 15896 1727203894.18303: stdout chunk (state=3): >>><<< 15896 1727203894.18318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203894.18321: _low_level_execute_command(): starting 15896 1727203894.18326: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/AnsiballZ_ping.py && sleep 0' 15896 1727203894.18750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203894.18754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203894.18782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203894.18785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203894.18788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.18844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203894.18847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203894.18850: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15896 1727203894.18944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.35381: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15896 1727203894.36815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203894.36818: stdout chunk (state=3): >>><<< 15896 1727203894.36821: stderr chunk (state=3): >>><<< 15896 1727203894.37148: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
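The round trip above ends with the remote `AnsiballZ_ping.py` printing `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}` on stdout, which the controller parses as the module result. A minimal sketch of a ping-style module's core logic (illustrative, not the actual `ansible.builtin.ping` source):

```python
# Sketch: a ping-style module simply echoes back its "data" argument
# ("pong" by default). A successful round trip therefore proves the SSH
# transport and the remote Python interpreter both work; it says nothing
# about the managed node beyond that. Illustrative only.
import json

def ping(module_args):
    data = module_args.get("data", "pong")
    return {
        "ping": data,
        "invocation": {"module_args": {"data": data}},
    }

print(json.dumps(ping({"data": "pong"})))
```

The controller then wraps this stdout into the task result shown a few lines below (`ok: [managed-node1] => {"changed": false, "ping": "pong"}`).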
15896 1727203894.37153: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203894.37156: _low_level_execute_command(): starting 15896 1727203894.37158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203894.0734477-18708-89532393551281/ > /dev/null 2>&1 && sleep 0' 15896 1727203894.38284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203894.38287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203894.38290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203894.38297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203894.38329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203894.38380: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203894.38385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203894.38388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203894.38390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 
1727203894.38392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203894.38441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203894.38444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203894.38585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203894.38691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203894.38836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203894.40793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203894.40871: stderr chunk (state=3): >>><<< 15896 1727203894.40877: stdout chunk (state=3): >>><<< 15896 1727203894.40880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203894.40883: handler run complete 15896 1727203894.40927: attempt loop complete, returning result 15896 1727203894.40929: _execute() done 15896 1727203894.40932: dumping result to json 15896 1727203894.40933: done dumping result, returning 15896 1727203894.40935: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-fb83-b6ad-0000000000e4] 15896 1727203894.40937: sending task result for task 028d2410-947f-fb83-b6ad-0000000000e4 15896 1727203894.41159: done sending task result for task 028d2410-947f-fb83-b6ad-0000000000e4 15896 1727203894.41162: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 15896 1727203894.41227: no more pending results, returning what we have 15896 1727203894.41231: results queue empty 15896 1727203894.41232: checking for any_errors_fatal 15896 1727203894.41238: done checking for any_errors_fatal 15896 1727203894.41239: checking for max_fail_percentage 15896 1727203894.41240: done checking for max_fail_percentage 15896 1727203894.41241: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.41241: done checking to see if all hosts have failed 15896 1727203894.41242: getting the remaining hosts for this loop 15896 1727203894.41244: done getting the remaining hosts for this loop 15896 1727203894.41246: getting the next task for host managed-node1 15896 1727203894.41256: done getting next task for host managed-node1 15896 1727203894.41258: ^ task is: TASK: meta (role_complete) 15896 1727203894.41261: ^ state is: 
HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203894.41275: getting variables 15896 1727203894.41278: in VariableManager get_vars() 15896 1727203894.41331: Calling all_inventory to load vars for managed-node1 15896 1727203894.41334: Calling groups_inventory to load vars for managed-node1 15896 1727203894.41336: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.41345: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.41348: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.41350: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.43085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.44982: done with get_vars() 15896 1727203894.45008: done getting variables 15896 1727203894.45152: done queuing things up, now waiting for results queue to drain 15896 1727203894.45154: results queue empty 15896 1727203894.45155: checking for any_errors_fatal 15896 1727203894.45158: done checking for any_errors_fatal 15896 1727203894.45159: checking for max_fail_percentage 15896 1727203894.45160: done checking for max_fail_percentage 15896 1727203894.45161: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.45161: done checking to see if all hosts have failed 15896 1727203894.45162: getting the remaining hosts for this loop 15896 1727203894.45163: 
done getting the remaining hosts for this loop 15896 1727203894.45166: getting the next task for host managed-node1 15896 1727203894.45171: done getting next task for host managed-node1 15896 1727203894.45176: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203894.45178: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203894.45199: getting variables 15896 1727203894.45201: in VariableManager get_vars() 15896 1727203894.45226: Calling all_inventory to load vars for managed-node1 15896 1727203894.45228: Calling groups_inventory to load vars for managed-node1 15896 1727203894.45230: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.45236: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.45238: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.45241: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.47728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.49884: done with get_vars() 15896 1727203894.49908: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.477) 0:00:40.089 ***** 15896 1727203894.50001: 
entering _queue_task() for managed-node1/include_tasks 15896 1727203894.51301: worker is 1 (out of 1 available) 15896 1727203894.51313: exiting _queue_task() for managed-node1/include_tasks 15896 1727203894.51324: done queuing things up, now waiting for results queue to drain 15896 1727203894.51325: waiting for pending results... 15896 1727203894.51843: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203894.51972: in run() - task 028d2410-947f-fb83-b6ad-00000000011b 15896 1727203894.52003: variable 'ansible_search_path' from source: unknown 15896 1727203894.52011: variable 'ansible_search_path' from source: unknown 15896 1727203894.52052: calling self._execute() 15896 1727203894.52166: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.52180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.52195: variable 'omit' from source: magic vars 15896 1727203894.52615: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.52634: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.52655: _execute() done 15896 1727203894.52668: dumping result to json 15896 1727203894.52678: done dumping result, returning 15896 1727203894.52690: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-fb83-b6ad-00000000011b] 15896 1727203894.52699: sending task result for task 028d2410-947f-fb83-b6ad-00000000011b 15896 1727203894.52916: done sending task result for task 028d2410-947f-fb83-b6ad-00000000011b 15896 1727203894.52920: WORKER PROCESS EXITING 15896 1727203894.52965: no more pending results, returning what we have 15896 1727203894.52970: in VariableManager get_vars() 15896 1727203894.53031: Calling all_inventory to load vars for managed-node1 15896 1727203894.53034: Calling 
groups_inventory to load vars for managed-node1 15896 1727203894.53036: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.53047: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.53050: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.53053: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.55001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.56500: done with get_vars() 15896 1727203894.56523: variable 'ansible_search_path' from source: unknown 15896 1727203894.56525: variable 'ansible_search_path' from source: unknown 15896 1727203894.56565: we have included files to process 15896 1727203894.56566: generating all_blocks data 15896 1727203894.56568: done generating all_blocks data 15896 1727203894.56574: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203894.56577: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203894.56580: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203894.57131: done processing included file 15896 1727203894.57133: iterating over new_blocks loaded from include file 15896 1727203894.57135: in VariableManager get_vars() 15896 1727203894.57167: done with get_vars() 15896 1727203894.57170: filtering new block on tags 15896 1727203894.57189: done filtering new block on tags 15896 1727203894.57192: in VariableManager get_vars() 15896 1727203894.57223: done with get_vars() 15896 1727203894.57224: filtering new block on tags 15896 1727203894.57244: done filtering new block on tags 15896 1727203894.57246: in VariableManager get_vars() 15896 1727203894.57274: done with get_vars() 15896 1727203894.57278: filtering 
new block on tags 15896 1727203894.57297: done filtering new block on tags 15896 1727203894.57299: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 15896 1727203894.57304: extending task lists for all hosts with included blocks 15896 1727203894.58255: done extending task lists 15896 1727203894.58257: done processing included files 15896 1727203894.58258: results queue empty 15896 1727203894.58258: checking for any_errors_fatal 15896 1727203894.58260: done checking for any_errors_fatal 15896 1727203894.58261: checking for max_fail_percentage 15896 1727203894.58262: done checking for max_fail_percentage 15896 1727203894.58262: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.58263: done checking to see if all hosts have failed 15896 1727203894.58264: getting the remaining hosts for this loop 15896 1727203894.58265: done getting the remaining hosts for this loop 15896 1727203894.58268: getting the next task for host managed-node1 15896 1727203894.58272: done getting next task for host managed-node1 15896 1727203894.58275: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203894.58481: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203894.58493: getting variables 15896 1727203894.58494: in VariableManager get_vars() 15896 1727203894.58517: Calling all_inventory to load vars for managed-node1 15896 1727203894.58519: Calling groups_inventory to load vars for managed-node1 15896 1727203894.58521: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.58526: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.58528: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.58531: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.61057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.63333: done with get_vars() 15896 1727203894.63358: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.134) 0:00:40.223 ***** 15896 1727203894.63437: entering _queue_task() for managed-node1/setup 15896 1727203894.63837: worker is 1 (out of 1 available) 15896 1727203894.63849: exiting _queue_task() for managed-node1/setup 15896 1727203894.63861: done queuing things up, now waiting for results queue to drain 15896 1727203894.63863: waiting for pending results... 
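The task just queued ("Ensure ansible_facts used by role are present", `set_facts.yml:3`) gates a setup run on whether any fact the role needs is still missing; its conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluates False here so gathering is skipped. A Python sketch of that set-difference check (variable names taken from the log; the helper function is illustrative):

```python
# Sketch of the required-facts gate: re-gather facts only when some fact
# name the role depends on is absent from the facts already collected.
# needs_gathering is an illustrative stand-in for the Jinja expression
# "__network_required_facts | difference(ansible_facts.keys() | list) | length > 0".
def needs_gathering(required_facts, ansible_facts):
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

ansible_facts = {"distribution": "CentOS", "distribution_major_version": "9"}
print(needs_gathering(["distribution"], ansible_facts))  # False: skip, as in the log
print(needs_gathering(["os_family"], ansible_facts))     # True: would gather
```

Skipping the gather when nothing is missing is what keeps repeated role invocations cheap on long plays.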
15896 1727203894.64187: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203894.64331: in run() - task 028d2410-947f-fb83-b6ad-00000000084f 15896 1727203894.64345: variable 'ansible_search_path' from source: unknown 15896 1727203894.64350: variable 'ansible_search_path' from source: unknown 15896 1727203894.64486: calling self._execute() 15896 1727203894.64490: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.64493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.64500: variable 'omit' from source: magic vars 15896 1727203894.64882: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.64893: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.65103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203894.67486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203894.67501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203894.67537: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203894.67570: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203894.67620: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203894.67700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203894.67730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203894.67755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203894.67809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203894.67829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203894.68048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203894.68052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203894.68055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203894.68057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203894.68062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203894.68273: variable '__network_required_facts' from source: role 
'' defaults 15896 1727203894.68283: variable 'ansible_facts' from source: unknown 15896 1727203894.69130: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15896 1727203894.69134: when evaluation is False, skipping this task 15896 1727203894.69137: _execute() done 15896 1727203894.69233: dumping result to json 15896 1727203894.69242: done dumping result, returning 15896 1727203894.69245: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-fb83-b6ad-00000000084f] 15896 1727203894.69250: sending task result for task 028d2410-947f-fb83-b6ad-00000000084f 15896 1727203894.69316: done sending task result for task 028d2410-947f-fb83-b6ad-00000000084f 15896 1727203894.69319: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203894.69394: no more pending results, returning what we have 15896 1727203894.69398: results queue empty 15896 1727203894.69399: checking for any_errors_fatal 15896 1727203894.69401: done checking for any_errors_fatal 15896 1727203894.69402: checking for max_fail_percentage 15896 1727203894.69403: done checking for max_fail_percentage 15896 1727203894.69404: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.69405: done checking to see if all hosts have failed 15896 1727203894.69406: getting the remaining hosts for this loop 15896 1727203894.69408: done getting the remaining hosts for this loop 15896 1727203894.69412: getting the next task for host managed-node1 15896 1727203894.69422: done getting next task for host managed-node1 15896 1727203894.69426: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203894.69430: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203894.69449: getting variables 15896 1727203894.69451: in VariableManager get_vars() 15896 1727203894.69511: Calling all_inventory to load vars for managed-node1 15896 1727203894.69514: Calling groups_inventory to load vars for managed-node1 15896 1727203894.69517: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.69528: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.69532: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.69535: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.71023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.74289: done with get_vars() 15896 1727203894.74319: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.111) 0:00:40.335 ***** 15896 1727203894.74631: entering _queue_task() for managed-node1/stat 15896 1727203894.75627: worker is 1 (out of 1 
available) 15896 1727203894.75641: exiting _queue_task() for managed-node1/stat 15896 1727203894.75654: done queuing things up, now waiting for results queue to drain 15896 1727203894.75656: waiting for pending results... 15896 1727203894.76592: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203894.76849: in run() - task 028d2410-947f-fb83-b6ad-000000000851 15896 1727203894.76865: variable 'ansible_search_path' from source: unknown 15896 1727203894.76869: variable 'ansible_search_path' from source: unknown 15896 1727203894.76903: calling self._execute() 15896 1727203894.77262: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.77267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.77270: variable 'omit' from source: magic vars 15896 1727203894.77969: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.77982: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.78383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203894.78847: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203894.78893: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203894.79116: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203894.79152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203894.79265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203894.79289: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203894.79473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203894.79479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203894.79809: variable '__network_is_ostree' from source: set_fact 15896 1727203894.79816: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203894.79819: when evaluation is False, skipping this task 15896 1727203894.79822: _execute() done 15896 1727203894.79824: dumping result to json 15896 1727203894.79827: done dumping result, returning 15896 1727203894.79836: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-fb83-b6ad-000000000851] 15896 1727203894.79842: sending task result for task 028d2410-947f-fb83-b6ad-000000000851 15896 1727203894.80178: done sending task result for task 028d2410-947f-fb83-b6ad-000000000851 15896 1727203894.80182: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203894.80294: no more pending results, returning what we have 15896 1727203894.80298: results queue empty 15896 1727203894.80299: checking for any_errors_fatal 15896 1727203894.80307: done checking for any_errors_fatal 15896 1727203894.80308: checking for max_fail_percentage 15896 1727203894.80310: done checking for max_fail_percentage 15896 1727203894.80311: checking to see if all hosts have failed and the running result is not ok 15896 
1727203894.80312: done checking to see if all hosts have failed 15896 1727203894.80313: getting the remaining hosts for this loop 15896 1727203894.80315: done getting the remaining hosts for this loop 15896 1727203894.80318: getting the next task for host managed-node1 15896 1727203894.80326: done getting next task for host managed-node1 15896 1727203894.80330: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203894.80334: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203894.80354: getting variables 15896 1727203894.80356: in VariableManager get_vars() 15896 1727203894.80419: Calling all_inventory to load vars for managed-node1 15896 1727203894.80421: Calling groups_inventory to load vars for managed-node1 15896 1727203894.80424: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.80436: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.80439: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.80442: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.84195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.86810: done with get_vars() 15896 1727203894.86848: done getting variables 15896 1727203894.86917: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.123) 0:00:40.458 ***** 15896 1727203894.86966: entering _queue_task() for managed-node1/set_fact 15896 1727203894.87838: worker is 1 (out of 1 available) 15896 1727203894.88078: exiting _queue_task() for managed-node1/set_fact 15896 1727203894.88090: done queuing things up, now waiting for results queue to drain 15896 1727203894.88092: waiting for pending results... 
15896 1727203894.88594: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203894.88941: in run() - task 028d2410-947f-fb83-b6ad-000000000852 15896 1727203894.89001: variable 'ansible_search_path' from source: unknown 15896 1727203894.89010: variable 'ansible_search_path' from source: unknown 15896 1727203894.89214: calling self._execute() 15896 1727203894.89484: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203894.89488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203894.89508: variable 'omit' from source: magic vars 15896 1727203894.90343: variable 'ansible_distribution_major_version' from source: facts 15896 1727203894.90502: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203894.90768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203894.91523: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203894.91578: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203894.91807: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203894.91811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203894.92006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203894.92094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203894.92174: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203894.92315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203894.92451: variable '__network_is_ostree' from source: set_fact 15896 1727203894.92510: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203894.92519: when evaluation is False, skipping this task 15896 1727203894.92530: _execute() done 15896 1727203894.92610: dumping result to json 15896 1727203894.92613: done dumping result, returning 15896 1727203894.92641: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-000000000852] 15896 1727203894.92645: sending task result for task 028d2410-947f-fb83-b6ad-000000000852 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203894.92916: no more pending results, returning what we have 15896 1727203894.92920: results queue empty 15896 1727203894.92920: checking for any_errors_fatal 15896 1727203894.92934: done checking for any_errors_fatal 15896 1727203894.92935: checking for max_fail_percentage 15896 1727203894.92937: done checking for max_fail_percentage 15896 1727203894.92938: checking to see if all hosts have failed and the running result is not ok 15896 1727203894.92938: done checking to see if all hosts have failed 15896 1727203894.92939: getting the remaining hosts for this loop 15896 1727203894.92941: done getting the remaining hosts for this loop 15896 1727203894.92944: getting the next task for host managed-node1 15896 1727203894.92954: done getting next task for host managed-node1 15896 
1727203894.92958: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203894.93077: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203894.93101: getting variables 15896 1727203894.93103: in VariableManager get_vars() 15896 1727203894.93419: Calling all_inventory to load vars for managed-node1 15896 1727203894.93422: Calling groups_inventory to load vars for managed-node1 15896 1727203894.93425: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203894.93436: Calling all_plugins_play to load vars for managed-node1 15896 1727203894.93439: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203894.93443: Calling groups_plugins_play to load vars for managed-node1 15896 1727203894.94022: done sending task result for task 028d2410-947f-fb83-b6ad-000000000852 15896 1727203894.94026: WORKER PROCESS EXITING 15896 1727203894.95650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203894.99344: done with get_vars() 15896 1727203894.99377: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:34 -0400 (0:00:00.125) 0:00:40.584 ***** 15896 1727203894.99558: entering _queue_task() for managed-node1/service_facts 15896 1727203895.00052: worker is 1 (out of 1 available) 15896 1727203895.00064: exiting _queue_task() for managed-node1/service_facts 15896 1727203895.00077: done queuing things up, now waiting for results queue to drain 15896 1727203895.00078: waiting for pending results... 15896 1727203895.00670: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203895.01148: in run() - task 028d2410-947f-fb83-b6ad-000000000854 15896 1727203895.01152: variable 'ansible_search_path' from source: unknown 15896 1727203895.01155: variable 'ansible_search_path' from source: unknown 15896 1727203895.01189: calling self._execute() 15896 1727203895.01474: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203895.01479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203895.01482: variable 'omit' from source: magic vars 15896 1727203895.02091: variable 'ansible_distribution_major_version' from source: facts 15896 1727203895.02109: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203895.02120: variable 'omit' from source: magic vars 15896 1727203895.02215: variable 'omit' from source: magic vars 15896 1727203895.02265: variable 'omit' from source: magic vars 15896 1727203895.02320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203895.02366: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203895.02393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 
1727203895.02464: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203895.02485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203895.02533: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203895.02555: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203895.02626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203895.02689: Set connection var ansible_shell_type to sh 15896 1727203895.02726: Set connection var ansible_connection to ssh 15896 1727203895.02739: Set connection var ansible_shell_executable to /bin/sh 15896 1727203895.02785: Set connection var ansible_pipelining to False 15896 1727203895.02788: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203895.02792: Set connection var ansible_timeout to 10 15896 1727203895.02800: variable 'ansible_shell_executable' from source: unknown 15896 1727203895.02808: variable 'ansible_connection' from source: unknown 15896 1727203895.02815: variable 'ansible_module_compression' from source: unknown 15896 1727203895.02822: variable 'ansible_shell_type' from source: unknown 15896 1727203895.02829: variable 'ansible_shell_executable' from source: unknown 15896 1727203895.02842: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203895.02896: variable 'ansible_pipelining' from source: unknown 15896 1727203895.02899: variable 'ansible_timeout' from source: unknown 15896 1727203895.02901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203895.03099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203895.03129: variable 'omit' from source: magic vars 15896 1727203895.03140: starting attempt loop 15896 1727203895.03147: running the handler 15896 1727203895.03221: _low_level_execute_command(): starting 15896 1727203895.03225: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203895.04692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.04772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203895.04794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203895.04915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203895.06716: stdout chunk (state=3): >>>/root <<< 15896 1727203895.07011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203895.07015: stdout chunk 
(state=3): >>><<< 15896 1727203895.07018: stderr chunk (state=3): >>><<< 15896 1727203895.07274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203895.07279: _low_level_execute_command(): starting 15896 1727203895.07282: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279 `" && echo ansible-tmp-1727203895.071081-18756-54930784542279="` echo /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279 `" ) && sleep 0' 15896 1727203895.08614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.08897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203895.08950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203895.09193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203895.11212: stdout chunk (state=3): >>>ansible-tmp-1727203895.071081-18756-54930784542279=/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279 <<< 15896 1727203895.11415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203895.11418: stdout chunk (state=3): >>><<< 15896 1727203895.11421: stderr chunk (state=3): >>><<< 15896 1727203895.11684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203895.071081-18756-54930784542279=/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203895.11687: variable 'ansible_module_compression' from source: unknown 15896 1727203895.11690: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15896 1727203895.11692: variable 'ansible_facts' from source: unknown 15896 1727203895.11797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py 15896 1727203895.11945: Sending initial data 15896 1727203895.11948: Sent initial data (160 bytes) 15896 1727203895.12420: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203895.12423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203895.12425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15896 1727203895.12428: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203895.12430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.12480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203895.12496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203895.12568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203895.14431: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203895.14510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203895.14608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp53uvyxir /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py <<< 15896 1727203895.14611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py" <<< 15896 1727203895.14673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp53uvyxir" to remote "/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py" <<< 15896 1727203895.15748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203895.15820: stderr chunk (state=3): >>><<< 15896 1727203895.15823: stdout chunk (state=3): >>><<< 15896 1727203895.15845: done transferring module to remote 15896 1727203895.15856: _low_level_execute_command(): starting 15896 1727203895.15859: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/ /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py && sleep 0' 15896 1727203895.16392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203895.16395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203895.16399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203895.16402: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203895.16409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.16482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203895.16510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203895.16606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203895.18781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203895.18785: stdout chunk (state=3): >>><<< 15896 1727203895.18787: stderr chunk (state=3): >>><<< 15896 1727203895.18790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203895.18792: _low_level_execute_command(): starting 15896 1727203895.18795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/AnsiballZ_service_facts.py && sleep 0' 15896 1727203895.19400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.19482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203895.19486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203895.19488: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203895.19682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203895.19686: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203895.19689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203895.19691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203895.19746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203896.95949: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", <<< 15896 1727203896.95965: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": 
"systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15896 1727203896.97680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203896.97684: stdout chunk (state=3): >>><<< 15896 1727203896.97693: stderr chunk (state=3): >>><<< 15896 1727203896.97728: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": 
"active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": 
{"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203896.99318: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203896.99328: _low_level_execute_command(): starting 15896 1727203896.99334: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203895.071081-18756-54930784542279/ > /dev/null 2>&1 && sleep 0' 15896 1727203897.00944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203897.00957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203897.00984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203897.00988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.01270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.03231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.03319: stderr chunk (state=3): >>><<< 15896 1727203897.03322: stdout chunk (state=3): >>><<< 15896 1727203897.03536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203897.03540: handler run complete 15896 
1727203897.03962: variable 'ansible_facts' from source: unknown 15896 1727203897.04371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203897.05394: variable 'ansible_facts' from source: unknown 15896 1727203897.05643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203897.06388: attempt loop complete, returning result 15896 1727203897.06393: _execute() done 15896 1727203897.06396: dumping result to json 15896 1727203897.06457: done dumping result, returning 15896 1727203897.06896: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-fb83-b6ad-000000000854] 15896 1727203897.06899: sending task result for task 028d2410-947f-fb83-b6ad-000000000854 15896 1727203897.08595: done sending task result for task 028d2410-947f-fb83-b6ad-000000000854 15896 1727203897.08598: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203897.08698: no more pending results, returning what we have 15896 1727203897.08701: results queue empty 15896 1727203897.08702: checking for any_errors_fatal 15896 1727203897.08706: done checking for any_errors_fatal 15896 1727203897.08707: checking for max_fail_percentage 15896 1727203897.08785: done checking for max_fail_percentage 15896 1727203897.08786: checking to see if all hosts have failed and the running result is not ok 15896 1727203897.08787: done checking to see if all hosts have failed 15896 1727203897.08788: getting the remaining hosts for this loop 15896 1727203897.08789: done getting the remaining hosts for this loop 15896 1727203897.08794: getting the next task for host managed-node1 15896 1727203897.08799: done getting next task for host managed-node1 15896 1727203897.08803: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203897.08807: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203897.08865: getting variables 15896 1727203897.08867: in VariableManager get_vars() 15896 1727203897.08962: Calling all_inventory to load vars for managed-node1 15896 1727203897.08965: Calling groups_inventory to load vars for managed-node1 15896 1727203897.08967: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203897.08979: Calling all_plugins_play to load vars for managed-node1 15896 1727203897.08982: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203897.08984: Calling groups_plugins_play to load vars for managed-node1 15896 1727203897.13203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203897.16818: done with get_vars() 15896 1727203897.16851: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 
14:51:37 -0400 (0:00:02.175) 0:00:42.760 ***** 15896 1727203897.17092: entering _queue_task() for managed-node1/package_facts 15896 1727203897.17985: worker is 1 (out of 1 available) 15896 1727203897.17997: exiting _queue_task() for managed-node1/package_facts 15896 1727203897.18008: done queuing things up, now waiting for results queue to drain 15896 1727203897.18009: waiting for pending results... 15896 1727203897.18797: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203897.18855: in run() - task 028d2410-947f-fb83-b6ad-000000000855 15896 1727203897.19304: variable 'ansible_search_path' from source: unknown 15896 1727203897.19308: variable 'ansible_search_path' from source: unknown 15896 1727203897.19311: calling self._execute() 15896 1727203897.19490: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203897.19503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203897.19517: variable 'omit' from source: magic vars 15896 1727203897.20157: variable 'ansible_distribution_major_version' from source: facts 15896 1727203897.20496: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203897.20508: variable 'omit' from source: magic vars 15896 1727203897.20718: variable 'omit' from source: magic vars 15896 1727203897.20761: variable 'omit' from source: magic vars 15896 1727203897.20869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203897.20968: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203897.21068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203897.21094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203897.21165: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203897.21207: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203897.21215: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203897.21222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203897.21583: Set connection var ansible_shell_type to sh 15896 1727203897.21586: Set connection var ansible_connection to ssh 15896 1727203897.21588: Set connection var ansible_shell_executable to /bin/sh 15896 1727203897.21590: Set connection var ansible_pipelining to False 15896 1727203897.21592: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203897.21594: Set connection var ansible_timeout to 10 15896 1727203897.21596: variable 'ansible_shell_executable' from source: unknown 15896 1727203897.21598: variable 'ansible_connection' from source: unknown 15896 1727203897.21600: variable 'ansible_module_compression' from source: unknown 15896 1727203897.21602: variable 'ansible_shell_type' from source: unknown 15896 1727203897.21604: variable 'ansible_shell_executable' from source: unknown 15896 1727203897.21607: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203897.21609: variable 'ansible_pipelining' from source: unknown 15896 1727203897.21611: variable 'ansible_timeout' from source: unknown 15896 1727203897.21613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203897.22581: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203897.22585: variable 'omit' from source: magic vars 15896 1727203897.22587: starting attempt loop 
15896 1727203897.22590: running the handler 15896 1727203897.22592: _low_level_execute_command(): starting 15896 1727203897.22593: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203897.24355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203897.24370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.24809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.26756: stdout chunk (state=3): >>>/root <<< 15896 1727203897.26781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.26784: stdout chunk (state=3): >>><<< 15896 1727203897.26801: stderr chunk (state=3): >>><<< 15896 1727203897.26884: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203897.26889: _low_level_execute_command(): starting 15896 1727203897.26892: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496 `" && echo ansible-tmp-1727203897.2681863-18891-141712277454496="` echo /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496 `" ) && sleep 0' 15896 1727203897.27981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203897.28083: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 15896 1727203897.28126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203897.28238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203897.28257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203897.28309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.28534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.30709: stdout chunk (state=3): >>>ansible-tmp-1727203897.2681863-18891-141712277454496=/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496 <<< 15896 1727203897.31381: stdout chunk (state=3): >>><<< 15896 1727203897.31384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.31386: stderr chunk (state=3): >>><<< 15896 1727203897.31388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203897.2681863-18891-141712277454496=/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203897.31390: variable 'ansible_module_compression' from source: unknown 15896 1727203897.31392: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15896 1727203897.31393: variable 'ansible_facts' from source: unknown 15896 1727203897.31778: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py 15896 1727203897.32916: Sending initial data 15896 1727203897.32919: Sent initial data (162 bytes) 15896 1727203897.33741: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203897.33745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203897.33943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203897.34006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203897.34219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.35959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203897.36401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpuu9d9jy_" to remote "/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py" <<< 15896 1727203897.36405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpuu9d9jy_ /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py <<< 15896 1727203897.38922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.38987: stderr chunk (state=3): >>><<< 15896 1727203897.38995: stdout chunk (state=3): >>><<< 15896 1727203897.39017: done transferring module to remote 15896 1727203897.39033: _low_level_execute_command(): starting 15896 1727203897.39045: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/ /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py && sleep 0' 15896 1727203897.40395: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203897.40410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.40520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.42505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.42514: stdout chunk (state=3): >>><<< 15896 1727203897.42523: stderr chunk (state=3): >>><<< 15896 1727203897.42545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203897.42561: _low_level_execute_command(): starting 15896 1727203897.42572: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/AnsiballZ_package_facts.py && sleep 0' 15896 1727203897.43647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203897.43694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203897.43712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203897.43980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203897.44022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.44109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.91130: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": 
"gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", 
"release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15896 1727203897.91272: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": 
"3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": 
"ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": 
"polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", 
"version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": 
"NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 15896 1727203897.91373: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": 
"perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": 
"4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": 
"1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": 
"18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15896 1727203897.93456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203897.93460: stdout chunk (state=3): >>><<< 15896 1727203897.93471: stderr chunk (state=3): >>><<< 15896 1727203897.93790: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", 
"version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": 
[{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", 
"version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": 
"20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": 
"tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", 
"release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": 
"0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": 
"2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": 
"perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", 
"release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": 
"1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": 
"1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203897.96418: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203897.96645: _low_level_execute_command(): starting 15896 1727203897.96651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203897.2681863-18891-141712277454496/ > /dev/null 2>&1 && sleep 0' 15896 1727203897.97334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203897.97337: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 15896 1727203897.97340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203897.97349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203897.97365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203897.97390: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203897.97399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203897.97428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203897.97550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203897.97554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203897.97644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203897.99649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203897.99653: stdout chunk (state=3): >>><<< 15896 1727203897.99662: stderr chunk (state=3): >>><<< 15896 1727203897.99680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203897.99686: handler run complete 15896 1727203898.00791: variable 'ansible_facts' from source: unknown 15896 1727203898.01465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.03477: variable 'ansible_facts' from source: unknown 15896 1727203898.03931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.04921: attempt loop complete, returning result 15896 1727203898.04943: _execute() done 15896 1727203898.04953: dumping result to json 15896 1727203898.05232: done dumping result, returning 15896 1727203898.05235: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-fb83-b6ad-000000000855] 15896 1727203898.05238: sending task result for task 028d2410-947f-fb83-b6ad-000000000855 15896 1727203898.09563: done 
sending task result for task 028d2410-947f-fb83-b6ad-000000000855 15896 1727203898.09566: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203898.09699: no more pending results, returning what we have 15896 1727203898.09702: results queue empty 15896 1727203898.09703: checking for any_errors_fatal 15896 1727203898.09709: done checking for any_errors_fatal 15896 1727203898.09710: checking for max_fail_percentage 15896 1727203898.09712: done checking for max_fail_percentage 15896 1727203898.09712: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.09713: done checking to see if all hosts have failed 15896 1727203898.09714: getting the remaining hosts for this loop 15896 1727203898.09715: done getting the remaining hosts for this loop 15896 1727203898.09719: getting the next task for host managed-node1 15896 1727203898.09727: done getting next task for host managed-node1 15896 1727203898.09730: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203898.09733: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.09744: getting variables 15896 1727203898.09746: in VariableManager get_vars() 15896 1727203898.10110: Calling all_inventory to load vars for managed-node1 15896 1727203898.10113: Calling groups_inventory to load vars for managed-node1 15896 1727203898.10115: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.10125: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.10128: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.10131: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.12440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.14907: done with get_vars() 15896 1727203898.14930: done getting variables 15896 1727203898.14991: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.979) 0:00:43.739 ***** 15896 1727203898.15036: entering _queue_task() for managed-node1/debug 15896 1727203898.15682: worker is 1 (out of 1 available) 15896 1727203898.15698: exiting _queue_task() for managed-node1/debug 15896 1727203898.15708: done queuing things up, now waiting for results queue to drain 15896 1727203898.15710: waiting for pending results... 
15896 1727203898.16105: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203898.16109: in run() - task 028d2410-947f-fb83-b6ad-00000000011c 15896 1727203898.16117: variable 'ansible_search_path' from source: unknown 15896 1727203898.16120: variable 'ansible_search_path' from source: unknown 15896 1727203898.16166: calling self._execute() 15896 1727203898.16570: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.16574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.16580: variable 'omit' from source: magic vars 15896 1727203898.16882: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.16888: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.16895: variable 'omit' from source: magic vars 15896 1727203898.16949: variable 'omit' from source: magic vars 15896 1727203898.17267: variable 'network_provider' from source: set_fact 15896 1727203898.17290: variable 'omit' from source: magic vars 15896 1727203898.17329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203898.17361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203898.17387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203898.17409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203898.17417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203898.17448: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203898.17451: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 
1727203898.17455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.17813: Set connection var ansible_shell_type to sh 15896 1727203898.17816: Set connection var ansible_connection to ssh 15896 1727203898.17818: Set connection var ansible_shell_executable to /bin/sh 15896 1727203898.17820: Set connection var ansible_pipelining to False 15896 1727203898.17822: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203898.17825: Set connection var ansible_timeout to 10 15896 1727203898.17827: variable 'ansible_shell_executable' from source: unknown 15896 1727203898.17829: variable 'ansible_connection' from source: unknown 15896 1727203898.17831: variable 'ansible_module_compression' from source: unknown 15896 1727203898.17833: variable 'ansible_shell_type' from source: unknown 15896 1727203898.17835: variable 'ansible_shell_executable' from source: unknown 15896 1727203898.17837: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.17839: variable 'ansible_pipelining' from source: unknown 15896 1727203898.17841: variable 'ansible_timeout' from source: unknown 15896 1727203898.17843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.18173: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203898.18180: variable 'omit' from source: magic vars 15896 1727203898.18182: starting attempt loop 15896 1727203898.18186: running the handler 15896 1727203898.18187: handler run complete 15896 1727203898.18189: attempt loop complete, returning result 15896 1727203898.18191: _execute() done 15896 1727203898.18193: dumping result to json 15896 1727203898.18198: done dumping result, returning 
15896 1727203898.18200: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-fb83-b6ad-00000000011c] 15896 1727203898.18201: sending task result for task 028d2410-947f-fb83-b6ad-00000000011c 15896 1727203898.18268: done sending task result for task 028d2410-947f-fb83-b6ad-00000000011c 15896 1727203898.18271: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 15896 1727203898.18362: no more pending results, returning what we have 15896 1727203898.18365: results queue empty 15896 1727203898.18366: checking for any_errors_fatal 15896 1727203898.18373: done checking for any_errors_fatal 15896 1727203898.18374: checking for max_fail_percentage 15896 1727203898.18377: done checking for max_fail_percentage 15896 1727203898.18378: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.18379: done checking to see if all hosts have failed 15896 1727203898.18379: getting the remaining hosts for this loop 15896 1727203898.18381: done getting the remaining hosts for this loop 15896 1727203898.18384: getting the next task for host managed-node1 15896 1727203898.18391: done getting next task for host managed-node1 15896 1727203898.18395: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203898.18398: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15896 1727203898.18410: getting variables 15896 1727203898.18411: in VariableManager get_vars() 15896 1727203898.18472: Calling all_inventory to load vars for managed-node1 15896 1727203898.18580: Calling groups_inventory to load vars for managed-node1 15896 1727203898.18584: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.18593: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.18595: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.18597: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.20331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.22489: done with get_vars() 15896 1727203898.22544: done getting variables 15896 1727203898.22619: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.076) 0:00:43.815 ***** 15896 1727203898.22659: entering _queue_task() for managed-node1/fail 15896 1727203898.23125: worker is 1 (out of 1 available) 15896 1727203898.23137: exiting _queue_task() for managed-node1/fail 15896 1727203898.23149: done queuing things up, now waiting for results queue to drain 15896 1727203898.23150: waiting for pending results... 
15896 1727203898.23518: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203898.23673: in run() - task 028d2410-947f-fb83-b6ad-00000000011d 15896 1727203898.23695: variable 'ansible_search_path' from source: unknown 15896 1727203898.23703: variable 'ansible_search_path' from source: unknown 15896 1727203898.23740: calling self._execute() 15896 1727203898.23895: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.23899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.23902: variable 'omit' from source: magic vars 15896 1727203898.24330: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.24347: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.24693: variable 'network_state' from source: role '' defaults 15896 1727203898.24697: Evaluated conditional (network_state != {}): False 15896 1727203898.24700: when evaluation is False, skipping this task 15896 1727203898.24702: _execute() done 15896 1727203898.24704: dumping result to json 15896 1727203898.24706: done dumping result, returning 15896 1727203898.24709: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-fb83-b6ad-00000000011d] 15896 1727203898.24711: sending task result for task 028d2410-947f-fb83-b6ad-00000000011d 15896 1727203898.24914: done sending task result for task 028d2410-947f-fb83-b6ad-00000000011d 15896 1727203898.24917: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203898.24966: no more pending results, 
returning what we have 15896 1727203898.24970: results queue empty 15896 1727203898.24971: checking for any_errors_fatal 15896 1727203898.24981: done checking for any_errors_fatal 15896 1727203898.24981: checking for max_fail_percentage 15896 1727203898.24984: done checking for max_fail_percentage 15896 1727203898.24985: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.24985: done checking to see if all hosts have failed 15896 1727203898.24986: getting the remaining hosts for this loop 15896 1727203898.24988: done getting the remaining hosts for this loop 15896 1727203898.24991: getting the next task for host managed-node1 15896 1727203898.24998: done getting next task for host managed-node1 15896 1727203898.25002: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203898.25007: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.25096: getting variables 15896 1727203898.25098: in VariableManager get_vars() 15896 1727203898.25165: Calling all_inventory to load vars for managed-node1 15896 1727203898.25168: Calling groups_inventory to load vars for managed-node1 15896 1727203898.25171: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.25248: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.25252: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.25256: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.28086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.35755: done with get_vars() 15896 1727203898.35788: done getting variables 15896 1727203898.35837: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.132) 0:00:43.947 ***** 15896 1727203898.35868: entering _queue_task() for managed-node1/fail 15896 1727203898.36232: worker is 1 (out of 1 available) 15896 1727203898.36244: exiting _queue_task() for managed-node1/fail 15896 1727203898.36257: done queuing things up, now waiting for results queue to drain 15896 1727203898.36259: waiting for pending results... 
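Both "Abort applying the network state configuration" tasks above skip on the same conditional, `network_state != {}`, evaluated against the role default. A minimal sketch of how such a `when:` clause resolves, using plain `jinja2` (which Ansible's conditional evaluation builds on — this is an illustration, not Ansible's actual code path):

```python
from jinja2 import Environment

env = Environment()
# The conditional string exactly as echoed in the log above.
cond = env.compile_expression("network_state != {}")

# Role default: empty dict ("variable 'network_state' from source: role '' defaults").
print(cond(network_state={}))                  # False -> task is skipped
print(cond(network_state={"interfaces": []}))  # True  -> the fail task would run
```

This matches the log's `Evaluated conditional (network_state != {}): False` followed by `when evaluation is False, skipping this task`.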
15896 1727203898.36527: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203898.36733: in run() - task 028d2410-947f-fb83-b6ad-00000000011e 15896 1727203898.36737: variable 'ansible_search_path' from source: unknown 15896 1727203898.36740: variable 'ansible_search_path' from source: unknown 15896 1727203898.36765: calling self._execute() 15896 1727203898.36956: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.36963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.36966: variable 'omit' from source: magic vars 15896 1727203898.37999: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.38021: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.38780: variable 'network_state' from source: role '' defaults 15896 1727203898.38784: Evaluated conditional (network_state != {}): False 15896 1727203898.38787: when evaluation is False, skipping this task 15896 1727203898.38789: _execute() done 15896 1727203898.38792: dumping result to json 15896 1727203898.38794: done dumping result, returning 15896 1727203898.38797: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-fb83-b6ad-00000000011e] 15896 1727203898.38799: sending task result for task 028d2410-947f-fb83-b6ad-00000000011e skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203898.39140: no more pending results, returning what we have 15896 1727203898.39144: results queue empty 15896 1727203898.39145: checking for any_errors_fatal 15896 1727203898.39155: done checking for any_errors_fatal 
15896 1727203898.39156: checking for max_fail_percentage 15896 1727203898.39159: done checking for max_fail_percentage 15896 1727203898.39159: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.39160: done checking to see if all hosts have failed 15896 1727203898.39161: getting the remaining hosts for this loop 15896 1727203898.39162: done getting the remaining hosts for this loop 15896 1727203898.39166: getting the next task for host managed-node1 15896 1727203898.39177: done getting next task for host managed-node1 15896 1727203898.39183: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203898.39186: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.39210: getting variables 15896 1727203898.39212: in VariableManager get_vars() 15896 1727203898.39270: Calling all_inventory to load vars for managed-node1 15896 1727203898.39273: Calling groups_inventory to load vars for managed-node1 15896 1727203898.39277: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.39290: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.39293: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.39296: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.39983: done sending task result for task 028d2410-947f-fb83-b6ad-00000000011e 15896 1727203898.40501: WORKER PROCESS EXITING 15896 1727203898.41598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.43138: done with get_vars() 15896 1727203898.43164: done getting variables 15896 1727203898.43228: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.073) 0:00:44.021 ***** 15896 1727203898.43265: entering _queue_task() for managed-node1/fail 15896 1727203898.43625: worker is 1 (out of 1 available) 15896 1727203898.43636: exiting _queue_task() for managed-node1/fail 15896 1727203898.43648: done queuing things up, now waiting for results queue to drain 15896 1727203898.43650: waiting for pending results... 
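The "EL10 or later" abort task just queued is gated on the distribution major version. A hedged sketch of that version gate, assuming only the conditional string echoed later in this log (`ansible_distribution_major_version | int > 9`):

```python
from jinja2 import Environment

env = Environment()
gate = env.compile_expression("ansible_distribution_major_version | int > 9")

# Ansible facts arrive as strings; the `int` filter coerces before comparing.
print(gate(ansible_distribution_major_version="9"))   # False -> EL9, gate not tripped
print(gate(ansible_distribution_major_version="10"))  # True  -> EL10 or later
```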
15896 1727203898.43996: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203898.44113: in run() - task 028d2410-947f-fb83-b6ad-00000000011f 15896 1727203898.44126: variable 'ansible_search_path' from source: unknown 15896 1727203898.44130: variable 'ansible_search_path' from source: unknown 15896 1727203898.44174: calling self._execute() 15896 1727203898.44295: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.44299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.44310: variable 'omit' from source: magic vars 15896 1727203898.44910: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.44914: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.45083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203898.47775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203898.48249: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203898.48311: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203898.48350: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203898.48388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203898.48497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.48541: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.48578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.48632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.48654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.48772: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.48796: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15896 1727203898.48928: variable 'ansible_distribution' from source: facts 15896 1727203898.48936: variable '__network_rh_distros' from source: role '' defaults 15896 1727203898.48956: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15896 1727203898.49271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.49274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.49279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 
1727203898.49312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.49329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.49384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.49409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.49434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.49485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.49502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.49541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.49569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15896 1727203898.49681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.49685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.49689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.49997: variable 'network_connections' from source: task vars 15896 1727203898.50014: variable 'port1_profile' from source: play vars 15896 1727203898.50091: variable 'port1_profile' from source: play vars 15896 1727203898.50107: variable 'port2_profile' from source: play vars 15896 1727203898.50180: variable 'port2_profile' from source: play vars 15896 1727203898.50194: variable 'network_state' from source: role '' defaults 15896 1727203898.50278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203898.50470: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203898.50512: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203898.50547: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203898.50612: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203898.50683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 
1727203898.50789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203898.50794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.50892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203898.51009: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15896 1727203898.51012: when evaluation is False, skipping this task 15896 1727203898.51013: _execute() done 15896 1727203898.51015: dumping result to json 15896 1727203898.51017: done dumping result, returning 15896 1727203898.51019: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-fb83-b6ad-00000000011f] 15896 1727203898.51021: sending task result for task 028d2410-947f-fb83-b6ad-00000000011f 15896 1727203898.51100: done sending task result for task 028d2410-947f-fb83-b6ad-00000000011f 15896 1727203898.51104: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional 
result was False" } 15896 1727203898.51146: no more pending results, returning what we have 15896 1727203898.51149: results queue empty 15896 1727203898.51150: checking for any_errors_fatal 15896 1727203898.51154: done checking for any_errors_fatal 15896 1727203898.51154: checking for max_fail_percentage 15896 1727203898.51156: done checking for max_fail_percentage 15896 1727203898.51157: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.51157: done checking to see if all hosts have failed 15896 1727203898.51158: getting the remaining hosts for this loop 15896 1727203898.51159: done getting the remaining hosts for this loop 15896 1727203898.51163: getting the next task for host managed-node1 15896 1727203898.51169: done getting next task for host managed-node1 15896 1727203898.51173: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203898.51177: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.51194: getting variables 15896 1727203898.51196: in VariableManager get_vars() 15896 1727203898.51245: Calling all_inventory to load vars for managed-node1 15896 1727203898.51247: Calling groups_inventory to load vars for managed-node1 15896 1727203898.51249: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.51260: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.51263: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.51265: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.52800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.54286: done with get_vars() 15896 1727203898.54311: done getting variables 15896 1727203898.54371: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.111) 0:00:44.133 ***** 15896 1727203898.54409: entering _queue_task() for managed-node1/dnf 15896 1727203898.54769: worker is 1 (out of 1 available) 15896 1727203898.54982: exiting _queue_task() for managed-node1/dnf 15896 1727203898.54994: done queuing things up, now waiting for results queue to drain 15896 1727203898.54995: waiting for pending results... 
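The teaming check above skipped because no connection profile has `type: team`. A sketch of that `selectattr` chain under one stated assumption: `match` is an Ansible-provided Jinja2 test, not part of core `jinja2`, so a minimal stand-in is registered here to make the expression evaluable.

```python
import re

from jinja2 import Environment

env = Environment()
# Stand-in for Ansible's `match` test (anchored regex match at start of string).
env.tests["match"] = lambda value, pattern: re.match(pattern, value) is not None

# First half of the conditional echoed in the log above.
expr = (
    'network_connections | selectattr("type", "defined") '
    '| selectattr("type", "match", "^team$") | list | length > 0'
)
has_team = env.compile_expression(expr)

# Profiles without a "type" key are dropped by the "defined" test first.
conns = [{"name": "port1", "type": "ethernet"}, {"name": "port2"}]
print(has_team(network_connections=conns))               # False -> task skipped
print(has_team(network_connections=[{"type": "team"}]))  # True  -> abort would fire
```

The second half of the logged conditional applies the same chain to `network_state.get("interfaces", [])`; both halves were False here, hence the skip.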
15896 1727203898.55195: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203898.55253: in run() - task 028d2410-947f-fb83-b6ad-000000000120 15896 1727203898.55266: variable 'ansible_search_path' from source: unknown 15896 1727203898.55270: variable 'ansible_search_path' from source: unknown 15896 1727203898.55309: calling self._execute() 15896 1727203898.55427: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.55431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.55448: variable 'omit' from source: magic vars 15896 1727203898.55881: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.55887: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.56235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203898.59798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203898.59950: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203898.60082: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203898.60102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203898.60128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203898.60282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.60285: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.60288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.60355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.60358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.60783: variable 'ansible_distribution' from source: facts 15896 1727203898.60787: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.60790: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15896 1727203898.60792: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203898.60852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.60879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.60902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.60945: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.60963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.61093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.61116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.61145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.61185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.61198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.61236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.61266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 
1727203898.61292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.61329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.61344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.61624: variable 'network_connections' from source: task vars 15896 1727203898.61628: variable 'port1_profile' from source: play vars 15896 1727203898.61630: variable 'port1_profile' from source: play vars 15896 1727203898.61639: variable 'port2_profile' from source: play vars 15896 1727203898.61702: variable 'port2_profile' from source: play vars 15896 1727203898.61768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203898.62231: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203898.62235: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203898.62237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203898.62249: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203898.62294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203898.62315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203898.62397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.62423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203898.62544: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203898.62900: variable 'network_connections' from source: task vars 15896 1727203898.62903: variable 'port1_profile' from source: play vars 15896 1727203898.62968: variable 'port1_profile' from source: play vars 15896 1727203898.62972: variable 'port2_profile' from source: play vars 15896 1727203898.63241: variable 'port2_profile' from source: play vars 15896 1727203898.63266: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203898.63269: when evaluation is False, skipping this task 15896 1727203898.63272: _execute() done 15896 1727203898.63274: dumping result to json 15896 1727203898.63280: done dumping result, returning 15896 1727203898.63290: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000120] 15896 1727203898.63293: sending task result for task 028d2410-947f-fb83-b6ad-000000000120 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203898.63566: no more pending results, returning what we have 15896 
1727203898.63570: results queue empty 15896 1727203898.63571: checking for any_errors_fatal 15896 1727203898.63580: done checking for any_errors_fatal 15896 1727203898.63581: checking for max_fail_percentage 15896 1727203898.63583: done checking for max_fail_percentage 15896 1727203898.63584: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.63584: done checking to see if all hosts have failed 15896 1727203898.63585: getting the remaining hosts for this loop 15896 1727203898.63587: done getting the remaining hosts for this loop 15896 1727203898.63590: getting the next task for host managed-node1 15896 1727203898.63597: done getting next task for host managed-node1 15896 1727203898.63601: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203898.63604: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.63616: done sending task result for task 028d2410-947f-fb83-b6ad-000000000120 15896 1727203898.63621: WORKER PROCESS EXITING 15896 1727203898.63634: getting variables 15896 1727203898.63636: in VariableManager get_vars() 15896 1727203898.63704: Calling all_inventory to load vars for managed-node1 15896 1727203898.63707: Calling groups_inventory to load vars for managed-node1 15896 1727203898.63710: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.63721: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.63725: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.63728: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.65841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.68095: done with get_vars() 15896 1727203898.68121: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15896 1727203898.68373: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.139) 0:00:44.273 ***** 15896 1727203898.68411: entering _queue_task() for managed-node1/yum 15896 1727203898.68988: worker is 1 (out of 1 available) 15896 1727203898.68999: exiting _queue_task() for managed-node1/yum 15896 1727203898.69010: done queuing things up, now 
waiting for results queue to drain 15896 1727203898.69012: waiting for pending results... 15896 1727203898.69124: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203898.69262: in run() - task 028d2410-947f-fb83-b6ad-000000000121 15896 1727203898.69287: variable 'ansible_search_path' from source: unknown 15896 1727203898.69295: variable 'ansible_search_path' from source: unknown 15896 1727203898.69336: calling self._execute() 15896 1727203898.69448: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.69466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.69483: variable 'omit' from source: magic vars 15896 1727203898.69890: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.69897: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.70107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203898.72657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203898.72736: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203898.72809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203898.72853: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203898.72932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203898.72977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.73012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.73045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.73295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.73299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.73301: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.73304: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15896 1727203898.73306: when evaluation is False, skipping this task 15896 1727203898.73308: _execute() done 15896 1727203898.73310: dumping result to json 15896 1727203898.73313: done dumping result, returning 15896 1727203898.73315: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000121] 15896 1727203898.73318: sending task result for task 028d2410-947f-fb83-b6ad-000000000121 15896 1727203898.73393: done sending task result for task 028d2410-947f-fb83-b6ad-000000000121 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", 
"skip_reason": "Conditional result was False" } 15896 1727203898.73442: no more pending results, returning what we have 15896 1727203898.73446: results queue empty 15896 1727203898.73447: checking for any_errors_fatal 15896 1727203898.73453: done checking for any_errors_fatal 15896 1727203898.73454: checking for max_fail_percentage 15896 1727203898.73455: done checking for max_fail_percentage 15896 1727203898.73456: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.73457: done checking to see if all hosts have failed 15896 1727203898.73458: getting the remaining hosts for this loop 15896 1727203898.73461: done getting the remaining hosts for this loop 15896 1727203898.73465: getting the next task for host managed-node1 15896 1727203898.73472: done getting next task for host managed-node1 15896 1727203898.73478: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203898.73481: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203898.73501: getting variables 15896 1727203898.73503: in VariableManager get_vars() 15896 1727203898.73558: Calling all_inventory to load vars for managed-node1 15896 1727203898.73563: Calling groups_inventory to load vars for managed-node1 15896 1727203898.73566: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.73578: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.73581: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.73584: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.74492: WORKER PROCESS EXITING 15896 1727203898.77670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203898.80891: done with get_vars() 15896 1727203898.80921: done getting variables 15896 1727203898.81186: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:51:38 -0400 (0:00:00.128) 0:00:44.401 ***** 15896 1727203898.81222: entering _queue_task() for managed-node1/fail 15896 1727203898.82000: worker is 1 (out of 1 available) 15896 1727203898.82013: exiting _queue_task() for managed-node1/fail 15896 1727203898.82026: done queuing things up, now waiting for results queue to drain 15896 1727203898.82028: waiting for pending results... 
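The two skips above ("Evaluated conditional (...): False") both come from templated `when:` conditionals. A minimal sketch of that evaluation, using Jinja2 directly with the exact condition string from tasks/main.yml:48 (Ansible's real TaskExecutor adds variable management and error wrapping around this; the version values below are illustrative):

```python
# Sketch: an Ansible `when:` clause is a Jinja2 expression evaluated against
# the host's variables. This mirrors the "Evaluated conditional" log lines;
# it is a simplified illustration, not Ansible's actual implementation.
from jinja2 import Environment

env = Environment()

# The condition from the skipped YUM task, copied from the log.
cond = env.compile_expression("ansible_distribution_major_version | int < 8")

# Fact values are strings; the `int` filter coerces before comparing.
print(cond(ansible_distribution_major_version="9"))  # False -> task skipped
print(cond(ansible_distribution_major_version="7"))  # True  -> task would run
```

Because the managed node reports a major version of 8 or later, the conditional is False and the DNF/YUM update-check task is skipped, exactly as the `skipping: [managed-node1]` result shows.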
15896 1727203898.82595: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203898.82902: in run() - task 028d2410-947f-fb83-b6ad-000000000122 15896 1727203898.82923: variable 'ansible_search_path' from source: unknown 15896 1727203898.82934: variable 'ansible_search_path' from source: unknown 15896 1727203898.82979: calling self._execute() 15896 1727203898.83290: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203898.83481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203898.83485: variable 'omit' from source: magic vars 15896 1727203898.84085: variable 'ansible_distribution_major_version' from source: facts 15896 1727203898.84104: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203898.84481: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203898.84632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203898.89025: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203898.89253: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203898.89681: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203898.89685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203898.89688: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203898.89691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203898.89740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.89773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.90181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.90186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.90189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.90192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.90194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.90197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.90392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.90440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203898.90469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203898.90502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.90546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203898.90566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203898.90963: variable 'network_connections' from source: task vars 15896 1727203898.90984: variable 'port1_profile' from source: play vars 15896 1727203898.91059: variable 'port1_profile' from source: play vars 15896 1727203898.91380: variable 'port2_profile' from source: play vars 15896 1727203898.91383: variable 'port2_profile' from source: play vars 15896 1727203898.91434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203898.91827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203898.91871: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 
1727203898.92180: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203898.92184: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203898.92196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203898.92224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203898.92256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203898.92289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203898.92580: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203898.92764: variable 'network_connections' from source: task vars 15896 1727203898.92989: variable 'port1_profile' from source: play vars 15896 1727203898.93055: variable 'port1_profile' from source: play vars 15896 1727203898.93069: variable 'port2_profile' from source: play vars 15896 1727203898.93135: variable 'port2_profile' from source: play vars 15896 1727203898.93407: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203898.93581: when evaluation is False, skipping this task 15896 1727203898.93584: _execute() done 15896 1727203898.93586: dumping result to json 15896 1727203898.93588: done dumping result, returning 15896 1727203898.93590: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000122] 15896 1727203898.93600: sending task result for task 028d2410-947f-fb83-b6ad-000000000122 15896 1727203898.93667: done sending task result for task 028d2410-947f-fb83-b6ad-000000000122 15896 1727203898.93670: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203898.93731: no more pending results, returning what we have 15896 1727203898.93734: results queue empty 15896 1727203898.93735: checking for any_errors_fatal 15896 1727203898.93740: done checking for any_errors_fatal 15896 1727203898.93741: checking for max_fail_percentage 15896 1727203898.93743: done checking for max_fail_percentage 15896 1727203898.93743: checking to see if all hosts have failed and the running result is not ok 15896 1727203898.93744: done checking to see if all hosts have failed 15896 1727203898.93745: getting the remaining hosts for this loop 15896 1727203898.93746: done getting the remaining hosts for this loop 15896 1727203898.93749: getting the next task for host managed-node1 15896 1727203898.93755: done getting next task for host managed-node1 15896 1727203898.93758: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15896 1727203898.93763: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203898.93781: getting variables 15896 1727203898.93783: in VariableManager get_vars() 15896 1727203898.93923: Calling all_inventory to load vars for managed-node1 15896 1727203898.93925: Calling groups_inventory to load vars for managed-node1 15896 1727203898.93928: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203898.93936: Calling all_plugins_play to load vars for managed-node1 15896 1727203898.93939: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203898.93942: Calling groups_plugins_play to load vars for managed-node1 15896 1727203898.97256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203899.00299: done with get_vars() 15896 1727203899.00323: done getting variables 15896 1727203899.00478: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:51:39 -0400 (0:00:00.192) 0:00:44.594 ***** 15896 1727203899.00516: entering _queue_task() for managed-node1/package 15896 1727203899.01165: worker is 1 (out of 1 available) 15896 1727203899.01181: exiting _queue_task() for managed-node1/package 15896 1727203899.01192: done queuing things up, now waiting for results queue to drain 15896 1727203899.01194: waiting for pending results... 
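The "Ask user's consent" fail task above was skipped because neither flag in `__network_wireless_connections_defined or __network_team_connections_defined` is true. The log shows the role re-resolving `network_connections` and the `port1_profile`/`port2_profile` play vars to compute those flags. A hypothetical sketch of that derivation (the profile names and the helper function are illustrative, not the role's actual Jinja2 defaults):

```python
# Simplified sketch of how the role's defaults derive the two flags the log
# evaluates: each flag is true only if a profile of that type is present.
def connections_of_type(network_connections, conn_type):
    """Return the connection profiles whose 'type' matches conn_type."""
    return [c for c in network_connections if c.get("type") == conn_type]

# Hypothetical profiles standing in for port1_profile / port2_profile.
network_connections = [
    {"name": "bond0.0", "type": "ethernet"},
    {"name": "bond0.1", "type": "ethernet"},
]

wireless_defined = bool(connections_of_type(network_connections, "wireless"))
team_defined = bool(connections_of_type(network_connections, "team"))

# Matches the log: Evaluated conditional (... or ...): False -> skip.
print(wireless_defined or team_defined)  # False
```

Since both profiles are plain ethernet, no NetworkManager restart consent is needed and the fail task never fires.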
15896 1727203899.01888: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 15896 1727203899.02254: in run() - task 028d2410-947f-fb83-b6ad-000000000123 15896 1727203899.02274: variable 'ansible_search_path' from source: unknown 15896 1727203899.02285: variable 'ansible_search_path' from source: unknown 15896 1727203899.02329: calling self._execute() 15896 1727203899.02782: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203899.02786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203899.02789: variable 'omit' from source: magic vars 15896 1727203899.03581: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.03585: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203899.03836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203899.04308: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203899.04360: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203899.04782: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203899.04786: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203899.05013: variable 'network_packages' from source: role '' defaults 15896 1727203899.05130: variable '__network_provider_setup' from source: role '' defaults 15896 1727203899.05149: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203899.05219: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203899.05393: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203899.05465: variable 
'__network_packages_default_nm' from source: role '' defaults 15896 1727203899.05871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203899.09845: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203899.10029: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203899.10221: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203899.10263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203899.10681: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203899.10685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.10687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.10690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.10692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.10826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 
1727203899.10878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.10909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.11008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.11053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.11480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.11654: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203899.11867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.11955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.11989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.12038: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.12058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.12158: variable 'ansible_python' from source: facts 15896 1727203899.12191: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203899.12426: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203899.12612: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203899.12730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.12761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.12794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.12861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.12880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.12924: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.12958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.12991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.13034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.13119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.13255: variable 'network_connections' from source: task vars 15896 1727203899.13435: variable 'port1_profile' from source: play vars 15896 1727203899.13534: variable 'port1_profile' from source: play vars 15896 1727203899.13881: variable 'port2_profile' from source: play vars 15896 1727203899.13907: variable 'port2_profile' from source: play vars 15896 1727203899.13981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203899.14153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203899.14192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.14257: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203899.14311: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203899.14912: variable 'network_connections' from source: task vars 15896 1727203899.15024: variable 'port1_profile' from source: play vars 15896 1727203899.15197: variable 'port1_profile' from source: play vars 15896 1727203899.15321: variable 'port2_profile' from source: play vars 15896 1727203899.15488: variable 'port2_profile' from source: play vars 15896 1727203899.15587: variable '__network_packages_default_wireless' from source: role '' defaults 15896 1727203899.15833: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203899.16343: variable 'network_connections' from source: task vars 15896 1727203899.16419: variable 'port1_profile' from source: play vars 15896 1727203899.16488: variable 'port1_profile' from source: play vars 15896 1727203899.16502: variable 'port2_profile' from source: play vars 15896 1727203899.16571: variable 'port2_profile' from source: play vars 15896 1727203899.16604: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203899.16687: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203899.17035: variable 'network_connections' from source: task vars 15896 1727203899.17045: variable 'port1_profile' from source: play vars 15896 1727203899.17114: variable 'port1_profile' from source: play vars 15896 1727203899.17186: variable 'port2_profile' from source: play vars 15896 1727203899.17229: variable 'port2_profile' from source: play vars 15896 1727203899.17293: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203899.17356: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203899.17481: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203899.17484: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203899.17648: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203899.18448: variable 'network_connections' from source: task vars 15896 1727203899.18459: variable 'port1_profile' from source: play vars 15896 1727203899.18548: variable 'port1_profile' from source: play vars 15896 1727203899.18570: variable 'port2_profile' from source: play vars 15896 1727203899.18651: variable 'port2_profile' from source: play vars 15896 1727203899.18680: variable 'ansible_distribution' from source: facts 15896 1727203899.18691: variable '__network_rh_distros' from source: role '' defaults 15896 1727203899.18702: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.18729: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203899.18992: variable 'ansible_distribution' from source: facts 15896 1727203899.19002: variable '__network_rh_distros' from source: role '' defaults 15896 1727203899.19011: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.19028: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203899.19381: variable 'ansible_distribution' from source: facts 15896 1727203899.19384: variable '__network_rh_distros' from source: role '' defaults 15896 1727203899.19387: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.19389: variable 'network_provider' from source: set_fact 15896 1727203899.19391: variable 'ansible_facts' from source: unknown 
15896 1727203899.20112: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15896 1727203899.20120: when evaluation is False, skipping this task 15896 1727203899.20126: _execute() done 15896 1727203899.20132: dumping result to json 15896 1727203899.20138: done dumping result, returning 15896 1727203899.20150: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-fb83-b6ad-000000000123] 15896 1727203899.20162: sending task result for task 028d2410-947f-fb83-b6ad-000000000123 skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15896 1727203899.20345: no more pending results, returning what we have 15896 1727203899.20349: results queue empty 15896 1727203899.20349: checking for any_errors_fatal 15896 1727203899.20354: done checking for any_errors_fatal 15896 1727203899.20355: checking for max_fail_percentage 15896 1727203899.20357: done checking for max_fail_percentage 15896 1727203899.20358: checking to see if all hosts have failed and the running result is not ok 15896 1727203899.20358: done checking to see if all hosts have failed 15896 1727203899.20359: getting the remaining hosts for this loop 15896 1727203899.20360: done getting the remaining hosts for this loop 15896 1727203899.20364: getting the next task for host managed-node1 15896 1727203899.20371: done getting next task for host managed-node1 15896 1727203899.20380: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203899.20384: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203899.20402: getting variables 15896 1727203899.20408: in VariableManager get_vars() 15896 1727203899.20457: Calling all_inventory to load vars for managed-node1 15896 1727203899.20460: Calling groups_inventory to load vars for managed-node1 15896 1727203899.20462: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203899.20472: Calling all_plugins_play to load vars for managed-node1 15896 1727203899.20475: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203899.20646: Calling groups_plugins_play to load vars for managed-node1 15896 1727203899.21351: done sending task result for task 028d2410-947f-fb83-b6ad-000000000123 15896 1727203899.21354: WORKER PROCESS EXITING 15896 1727203899.22841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203899.24694: done with get_vars() 15896 1727203899.24725: done getting variables 15896 1727203899.24794: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:51:39 -0400 (0:00:00.243) 0:00:44.837 ***** 15896 1727203899.24830: 
entering _queue_task() for managed-node1/package 15896 1727203899.25620: worker is 1 (out of 1 available) 15896 1727203899.25634: exiting _queue_task() for managed-node1/package 15896 1727203899.25648: done queuing things up, now waiting for results queue to drain 15896 1727203899.25649: waiting for pending results... 15896 1727203899.25988: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15896 1727203899.26133: in run() - task 028d2410-947f-fb83-b6ad-000000000124 15896 1727203899.26147: variable 'ansible_search_path' from source: unknown 15896 1727203899.26150: variable 'ansible_search_path' from source: unknown 15896 1727203899.26190: calling self._execute() 15896 1727203899.26299: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203899.26303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203899.26321: variable 'omit' from source: magic vars 15896 1727203899.26752: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.26756: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203899.26805: variable 'network_state' from source: role '' defaults 15896 1727203899.26816: Evaluated conditional (network_state != {}): False 15896 1727203899.26820: when evaluation is False, skipping this task 15896 1727203899.26822: _execute() done 15896 1727203899.26825: dumping result to json 15896 1727203899.26828: done dumping result, returning 15896 1727203899.26834: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000124] 15896 1727203899.26840: sending task result for task 028d2410-947f-fb83-b6ad-000000000124 15896 1727203899.26938: done sending task result for task 028d2410-947f-fb83-b6ad-000000000124 15896 
1727203899.26941: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203899.27017: no more pending results, returning what we have 15896 1727203899.27023: results queue empty 15896 1727203899.27024: checking for any_errors_fatal 15896 1727203899.27031: done checking for any_errors_fatal 15896 1727203899.27032: checking for max_fail_percentage 15896 1727203899.27033: done checking for max_fail_percentage 15896 1727203899.27034: checking to see if all hosts have failed and the running result is not ok 15896 1727203899.27035: done checking to see if all hosts have failed 15896 1727203899.27035: getting the remaining hosts for this loop 15896 1727203899.27037: done getting the remaining hosts for this loop 15896 1727203899.27040: getting the next task for host managed-node1 15896 1727203899.27046: done getting next task for host managed-node1 15896 1727203899.27049: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15896 1727203899.27053: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203899.27072: getting variables 15896 1727203899.27074: in VariableManager get_vars() 15896 1727203899.27121: Calling all_inventory to load vars for managed-node1 15896 1727203899.27123: Calling groups_inventory to load vars for managed-node1 15896 1727203899.27125: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203899.27133: Calling all_plugins_play to load vars for managed-node1 15896 1727203899.27135: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203899.27138: Calling groups_plugins_play to load vars for managed-node1 15896 1727203899.28821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203899.30404: done with get_vars() 15896 1727203899.30433: done getting variables 15896 1727203899.30499: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:51:39 -0400 (0:00:00.057) 0:00:44.894 ***** 15896 1727203899.30535: entering _queue_task() for managed-node1/package 15896 1727203899.30900: worker is 1 (out of 1 available) 15896 1727203899.30913: exiting _queue_task() for managed-node1/package 15896 1727203899.30925: done queuing things up, now waiting for results queue to drain 15896 1727203899.30926: waiting for pending results... 
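The two skips above ("Install NetworkManager and nmstate..." and the queued "Install python3-libnmstate...") follow the same gating pattern: a package task conditional on the `network_state` variable. A minimal sketch of such a task, reconstructed hypothetically from the conditionals in the log (the real task is at `roles/network/tasks/main.yml:85` in the collection and may differ):

```yaml
# Hypothetical reconstruction from the conditionals logged above,
# not the role's actual source.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - network_state != {}                        # evaluated False -> task skipped
```

Because `network_state` is empty (role default), the `when` list evaluates False and the executor emits the `skipping: [managed-node1]` result seen above without ever invoking the package module.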
15896 1727203899.31211: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15896 1727203899.31371: in run() - task 028d2410-947f-fb83-b6ad-000000000125 15896 1727203899.31401: variable 'ansible_search_path' from source: unknown 15896 1727203899.31510: variable 'ansible_search_path' from source: unknown 15896 1727203899.31513: calling self._execute() 15896 1727203899.31562: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203899.31577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203899.31889: variable 'omit' from source: magic vars 15896 1727203899.32481: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.32552: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203899.32796: variable 'network_state' from source: role '' defaults 15896 1727203899.32882: Evaluated conditional (network_state != {}): False 15896 1727203899.32977: when evaluation is False, skipping this task 15896 1727203899.32982: _execute() done 15896 1727203899.32984: dumping result to json 15896 1727203899.32987: done dumping result, returning 15896 1727203899.32989: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000125] 15896 1727203899.32992: sending task result for task 028d2410-947f-fb83-b6ad-000000000125 15896 1727203899.33063: done sending task result for task 028d2410-947f-fb83-b6ad-000000000125 15896 1727203899.33066: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203899.33125: no more pending results, returning what we have 15896 1727203899.33129: results queue empty 15896 1727203899.33130: checking for 
any_errors_fatal 15896 1727203899.33141: done checking for any_errors_fatal 15896 1727203899.33142: checking for max_fail_percentage 15896 1727203899.33144: done checking for max_fail_percentage 15896 1727203899.33145: checking to see if all hosts have failed and the running result is not ok 15896 1727203899.33145: done checking to see if all hosts have failed 15896 1727203899.33146: getting the remaining hosts for this loop 15896 1727203899.33148: done getting the remaining hosts for this loop 15896 1727203899.33152: getting the next task for host managed-node1 15896 1727203899.33158: done getting next task for host managed-node1 15896 1727203899.33165: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15896 1727203899.33169: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203899.33192: getting variables 15896 1727203899.33195: in VariableManager get_vars() 15896 1727203899.33253: Calling all_inventory to load vars for managed-node1 15896 1727203899.33256: Calling groups_inventory to load vars for managed-node1 15896 1727203899.33261: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203899.33274: Calling all_plugins_play to load vars for managed-node1 15896 1727203899.33582: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203899.33586: Calling groups_plugins_play to load vars for managed-node1 15896 1727203899.36174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203899.37754: done with get_vars() 15896 1727203899.37791: done getting variables 15896 1727203899.37853: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:51:39 -0400 (0:00:00.073) 0:00:44.968 ***** 15896 1727203899.37894: entering _queue_task() for managed-node1/service 15896 1727203899.38270: worker is 1 (out of 1 available) 15896 1727203899.38287: exiting _queue_task() for managed-node1/service 15896 1727203899.38298: done queuing things up, now waiting for results queue to drain 15896 1727203899.38300: waiting for pending results... 
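The task queued here is gated on two role-internal flags that are themselves templated from `network_connections` (which is why the log re-resolves `port1_profile`/`port2_profile` and reloads the filter/test plugins before evaluating the conditional). A hedged sketch of the shape of this task, with names taken from the log rather than the role source:

```yaml
# Hypothetical sketch matching the conditional evaluated in the log.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

With only bridge/port profiles defined and no wireless or team connections, both flags render False, so this restart is skipped in the run below.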
15896 1727203899.38694: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15896 1727203899.38748: in run() - task 028d2410-947f-fb83-b6ad-000000000126 15896 1727203899.38769: variable 'ansible_search_path' from source: unknown 15896 1727203899.38810: variable 'ansible_search_path' from source: unknown 15896 1727203899.38819: calling self._execute() 15896 1727203899.38927: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203899.38939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203899.38956: variable 'omit' from source: magic vars 15896 1727203899.39340: variable 'ansible_distribution_major_version' from source: facts 15896 1727203899.39464: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203899.39491: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203899.39692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203899.42278: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203899.42344: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203899.42403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203899.42438: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203899.42466: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203899.42545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15896 1727203899.42581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.42612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.42880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.42883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.42886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.42888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.42889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.42891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.42893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.42894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203899.42896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203899.42913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.42982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203899.43003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203899.43189: variable 'network_connections' from source: task vars 15896 1727203899.43208: variable 'port1_profile' from source: play vars 15896 1727203899.43294: variable 'port1_profile' from source: play vars 15896 1727203899.43339: variable 'port2_profile' from source: play vars 15896 1727203899.43382: variable 'port2_profile' from source: play vars 15896 1727203899.43464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203899.43633: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203899.43775: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 
1727203899.43779: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203899.43782: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203899.43805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203899.43832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203899.43866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203899.43903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203899.43956: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203899.44185: variable 'network_connections' from source: task vars 15896 1727203899.44195: variable 'port1_profile' from source: play vars 15896 1727203899.44265: variable 'port1_profile' from source: play vars 15896 1727203899.44281: variable 'port2_profile' from source: play vars 15896 1727203899.44348: variable 'port2_profile' from source: play vars 15896 1727203899.44382: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203899.44390: when evaluation is False, skipping this task 15896 1727203899.44397: _execute() done 15896 1727203899.44405: dumping result to json 15896 1727203899.44412: done dumping result, returning 15896 1727203899.44429: done running TaskExecutor() for managed-node1/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000126] 15896 1727203899.44480: sending task result for task 028d2410-947f-fb83-b6ad-000000000126 15896 1727203899.44807: done sending task result for task 028d2410-947f-fb83-b6ad-000000000126 15896 1727203899.44811: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203899.44851: no more pending results, returning what we have 15896 1727203899.44854: results queue empty 15896 1727203899.44855: checking for any_errors_fatal 15896 1727203899.44862: done checking for any_errors_fatal 15896 1727203899.44863: checking for max_fail_percentage 15896 1727203899.44865: done checking for max_fail_percentage 15896 1727203899.44866: checking to see if all hosts have failed and the running result is not ok 15896 1727203899.44866: done checking to see if all hosts have failed 15896 1727203899.44867: getting the remaining hosts for this loop 15896 1727203899.44869: done getting the remaining hosts for this loop 15896 1727203899.44872: getting the next task for host managed-node1 15896 1727203899.44879: done getting next task for host managed-node1 15896 1727203899.44882: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203899.44885: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203899.44902: getting variables 15896 1727203899.44904: in VariableManager get_vars() 15896 1727203899.44954: Calling all_inventory to load vars for managed-node1 15896 1727203899.44956: Calling groups_inventory to load vars for managed-node1 15896 1727203899.44962: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203899.44972: Calling all_plugins_play to load vars for managed-node1 15896 1727203899.44978: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203899.44981: Calling groups_plugins_play to load vars for managed-node1 15896 1727203899.46662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203899.49336: done with get_vars() 15896 1727203899.49365: done getting variables 15896 1727203899.49628: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:51:39 -0400 (0:00:00.117) 0:00:45.085 ***** 15896 1727203899.49665: entering _queue_task() for managed-node1/service 15896 1727203899.50283: worker is 1 (out of 1 available) 15896 1727203899.50295: exiting _queue_task() for managed-node1/service 15896 1727203899.50309: done queuing things up, now waiting for results queue to drain 15896 1727203899.50311: waiting for pending results... 
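Unlike the previous two tasks, this one's conditional (`network_provider == "nm" or network_state != {}`) evaluates True in the run below, so the service module actually executes. A minimal sketch of what is being run, assuming the `network_service_name` default seen in the log (hypothetical reconstruction, not the role's verbatim source):

```yaml
# Hypothetical sketch; network_service_name is a role default
# resolved in the log records that follow.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"  # NetworkManager under the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

This explains the long chain of `__network_provider_setup` / `__network_service_name_default_nm` / `__network_packages_default_nm` variable resolutions that follows: they all feed the templated service name and its package prerequisites.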
15896 1727203899.51064: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15896 1727203899.51551: in run() - task 028d2410-947f-fb83-b6ad-000000000127
15896 1727203899.51579: variable 'ansible_search_path' from source: unknown
15896 1727203899.51618: variable 'ansible_search_path' from source: unknown
15896 1727203899.51633: calling self._execute()
15896 1727203899.51778: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203899.51789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203899.51803: variable 'omit' from source: magic vars
15896 1727203899.52381: variable 'ansible_distribution_major_version' from source: facts
15896 1727203899.52399: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203899.52591: variable 'network_provider' from source: set_fact
15896 1727203899.52595: variable 'network_state' from source: role '' defaults
15896 1727203899.52605: Evaluated conditional (network_provider == "nm" or network_state != {}): True
15896 1727203899.52618: variable 'omit' from source: magic vars
15896 1727203899.52700: variable 'omit' from source: magic vars
15896 1727203899.52718: variable 'network_service_name' from source: role '' defaults
15896 1727203899.52919: variable 'network_service_name' from source: role '' defaults
15896 1727203899.52985: variable '__network_provider_setup' from source: role '' defaults
15896 1727203899.52997: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203899.53111: variable '__network_service_name_default_nm' from source: role '' defaults
15896 1727203899.53126: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203899.53200: variable '__network_packages_default_nm' from source: role '' defaults
15896 1727203899.53882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203899.58395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203899.58399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203899.58424: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203899.58458: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203899.58493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203899.58722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203899.58726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203899.58728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.58730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203899.58732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203899.58830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203899.58834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203899.58836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.58912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203899.58938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203899.59180: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15896 1727203899.59306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203899.59329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203899.59354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.59395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203899.59415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203899.59592: variable 'ansible_python' from source: facts
15896 1727203899.59595: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15896 1727203899.59614: variable '__network_wpa_supplicant_required' from source: role '' defaults
15896 1727203899.59700: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15896 1727203899.59869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203899.59895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203899.59979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.60017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203899.60065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203899.60194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203899.60216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203899.60316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.60393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203899.60409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203899.60796: variable 'network_connections' from source: task vars
15896 1727203899.60799: variable 'port1_profile' from source: play vars
15896 1727203899.60801: variable 'port1_profile' from source: play vars
15896 1727203899.60804: variable 'port2_profile' from source: play vars
15896 1727203899.60805: variable 'port2_profile' from source: play vars
15896 1727203899.60837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203899.61050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203899.61481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203899.61484: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896 1727203899.61487: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203899.61490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203899.61492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203899.61619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203899.61652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203899.61744: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203899.62482: variable 'network_connections' from source: task vars
15896 1727203899.62486: variable 'port1_profile' from source: play vars
15896 1727203899.62488: variable 'port1_profile' from source: play vars
15896 1727203899.62490: variable 'port2_profile' from source: play vars
15896 1727203899.62492: variable 'port2_profile' from source: play vars
15896 1727203899.62494: variable '__network_packages_default_wireless' from source: role '' defaults
15896 1727203899.62563: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203899.62981: variable 'network_connections' from source: task vars
15896 1727203899.62984: variable 'port1_profile' from source: play vars
15896 1727203899.62986: variable 'port1_profile' from source: play vars
15896 1727203899.62989: variable 'port2_profile' from source: play vars
15896 1727203899.63029: variable 'port2_profile' from source: play vars
15896 1727203899.63051: variable '__network_packages_default_team' from source: role '' defaults
15896 1727203899.63128: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203899.63651: variable 'network_connections' from source: task vars
15896 1727203899.63654: variable 'port1_profile' from source: play vars
15896 1727203899.63839: variable 'port1_profile' from source: play vars
15896 1727203899.63847: variable 'port2_profile' from source: play vars
15896 1727203899.64010: variable 'port2_profile' from source: play vars
15896 1727203899.64065: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203899.64277: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203899.64285: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203899.64391: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203899.64828: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15896 1727203899.66006: variable 'network_connections' from source: task vars
15896 1727203899.66009: variable 'port1_profile' from source: play vars
15896 1727203899.66073: variable 'port1_profile' from source: play vars
15896 1727203899.66202: variable 'port2_profile' from source: play vars
15896 1727203899.66259: variable 'port2_profile' from source: play vars
15896 1727203899.66270: variable 'ansible_distribution' from source: facts
15896 1727203899.66273: variable '__network_rh_distros' from source: role '' defaults
15896 1727203899.66282: variable 'ansible_distribution_major_version' from source: facts
15896 1727203899.66302: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15896 1727203899.66725: variable 'ansible_distribution' from source: facts
15896 1727203899.66729: variable '__network_rh_distros' from source: role '' defaults
15896 1727203899.66784: variable 'ansible_distribution_major_version' from source: facts
15896 1727203899.66798: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15896 1727203899.67213: variable 'ansible_distribution' from source: facts
15896 1727203899.67216: variable '__network_rh_distros' from source: role '' defaults
15896 1727203899.67222: variable 'ansible_distribution_major_version' from source: facts
15896 1727203899.67259: variable 'network_provider' from source: set_fact
15896 1727203899.67399: variable 'omit' from source: magic vars
15896 1727203899.67427: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15896 1727203899.67456: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15896 1727203899.67478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15896 1727203899.67495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203899.67593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15896 1727203899.67796: variable 'inventory_hostname' from source: host vars for 'managed-node1'
15896 1727203899.67799: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203899.67802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203899.67912: Set connection var ansible_shell_type to sh
15896 1727203899.67918: Set connection var ansible_connection to ssh
15896 1727203899.67924: Set connection var ansible_shell_executable to /bin/sh
15896 1727203899.67929: Set connection var ansible_pipelining to False
15896 1727203899.67939: Set connection var ansible_module_compression to ZIP_DEFLATED
15896 1727203899.67942: Set connection var ansible_timeout to 10
15896 1727203899.67967: variable 'ansible_shell_executable' from source: unknown
15896 1727203899.67971: variable 'ansible_connection' from source: unknown
15896 1727203899.67973: variable 'ansible_module_compression' from source: unknown
15896 1727203899.67978: variable 'ansible_shell_type' from source: unknown
15896 1727203899.67985: variable 'ansible_shell_executable' from source: unknown
15896 1727203899.67991: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203899.67993: variable 'ansible_pipelining' from source: unknown
15896 1727203899.68160: variable 'ansible_timeout' from source: unknown
15896 1727203899.68164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203899.68505: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15896 1727203899.68516: variable 'omit' from source: magic vars
15896 1727203899.68522: starting attempt loop
15896 1727203899.68524: running the handler
15896 1727203899.68833: variable 'ansible_facts' from source: unknown
15896 1727203899.70615: _low_level_execute_command(): starting
15896 1727203899.70641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15896 1727203899.72162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15896 1727203899.72167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
15896 1727203899.72201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15896 1727203899.72772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203899.72804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203899.74578: stdout chunk (state=3): >>>/root <<<
15896 1727203899.74706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15896 1727203899.74724: stderr chunk (state=3): >>><<<
15896 1727203899.74730: stdout chunk (state=3): >>><<<
15896 1727203899.74758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203899.74780: _low_level_execute_command(): starting
15896 1727203899.74785: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146 `" && echo ansible-tmp-1727203899.7475827-19246-117532893166146="` echo /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146 `" ) && sleep 0'
15896 1727203899.76013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15896 1727203899.76030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15896 1727203899.76043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15896 1727203899.76120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15896 1727203899.76261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
15896 1727203899.76265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15896 1727203899.76305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203899.76388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203899.78621: stdout chunk (state=3): >>>ansible-tmp-1727203899.7475827-19246-117532893166146=/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146 <<<
15896 1727203899.78984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15896 1727203899.79020: stderr chunk (state=3): >>><<<
15896 1727203899.79023: stdout chunk (state=3): >>><<<
15896 1727203899.79043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203899.7475827-19246-117532893166146=/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203899.79079: variable 'ansible_module_compression' from source: unknown
15896 1727203899.79202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED
15896 1727203899.79256: variable 'ansible_facts' from source: unknown
15896 1727203899.79795: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py
15896 1727203899.80137: Sending initial data
15896 1727203899.80140: Sent initial data (156 bytes)
15896 1727203899.81320: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15896 1727203899.81427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
15896 1727203899.81491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15896 1727203899.81511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203899.81659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203899.83433: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
15896 1727203899.83533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
15896 1727203899.83682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmprwkda3kv /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py <<<
15896 1727203899.83689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmprwkda3kv" to remote "/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py" <<<
15896 1727203899.86471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15896 1727203899.86658: stderr chunk (state=3): >>><<<
15896 1727203899.86661: stdout chunk (state=3): >>><<<
15896 1727203899.86732: done transferring module to remote
15896 1727203899.86742: _low_level_execute_command(): starting
15896 1727203899.86828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/ /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py && sleep 0'
15896 1727203899.87954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15896 1727203899.88045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<<
15896 1727203899.88057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
15896 1727203899.88217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15896 1727203899.88221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
15896 1727203899.88224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15896 1727203899.88444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203899.88551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203899.90879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15896 1727203899.90883: stdout chunk (state=3): >>><<<
15896 1727203899.90885: stderr chunk (state=3): >>><<<
15896 1727203899.90888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15896 1727203899.90890: _low_level_execute_command(): starting
15896 1727203899.90892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/AnsiballZ_systemd.py && sleep 0'
15896 1727203899.92417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15896 1727203899.92497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15896 1727203899.92698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
15896 1727203899.92801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15896 1727203899.92922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15896 1727203900.24285: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300601856", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "981463000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<<
15896 1727203900.24290: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network<<<
15896 1727203900.24299: stdout chunk (state=3): >>>-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667",
"InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15896 1727203900.26962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203900.26967: stdout chunk (state=3): >>><<< 15896 1727203900.26969: stderr chunk (state=3): >>><<< 15896 1727203900.26974: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10719232", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300601856", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "981463000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203900.27392: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203900.27769: _low_level_execute_command(): starting 15896 1727203900.27773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203899.7475827-19246-117532893166146/ > /dev/null 2>&1 && sleep 0' 15896 1727203900.29184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203900.29200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203900.29212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203900.29490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203900.31509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203900.31513: stdout chunk (state=3): >>><<< 15896 1727203900.31519: stderr chunk (state=3): >>><<< 15896 1727203900.31537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203900.31544: handler run complete 15896 1727203900.31612: attempt loop complete, returning result 
15896 1727203900.31615: _execute() done 15896 1727203900.31618: dumping result to json 15896 1727203900.31696: done dumping result, returning 15896 1727203900.31706: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-fb83-b6ad-000000000127] 15896 1727203900.31709: sending task result for task 028d2410-947f-fb83-b6ad-000000000127 15896 1727203900.32493: done sending task result for task 028d2410-947f-fb83-b6ad-000000000127 15896 1727203900.32496: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203900.32634: no more pending results, returning what we have 15896 1727203900.32638: results queue empty 15896 1727203900.32639: checking for any_errors_fatal 15896 1727203900.32645: done checking for any_errors_fatal 15896 1727203900.32646: checking for max_fail_percentage 15896 1727203900.32648: done checking for max_fail_percentage 15896 1727203900.32648: checking to see if all hosts have failed and the running result is not ok 15896 1727203900.32649: done checking to see if all hosts have failed 15896 1727203900.32650: getting the remaining hosts for this loop 15896 1727203900.32652: done getting the remaining hosts for this loop 15896 1727203900.32656: getting the next task for host managed-node1 15896 1727203900.32662: done getting next task for host managed-node1 15896 1727203900.32666: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203900.32669: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203900.32682: getting variables 15896 1727203900.32684: in VariableManager get_vars() 15896 1727203900.32732: Calling all_inventory to load vars for managed-node1 15896 1727203900.32734: Calling groups_inventory to load vars for managed-node1 15896 1727203900.32737: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203900.32746: Calling all_plugins_play to load vars for managed-node1 15896 1727203900.32750: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203900.32752: Calling groups_plugins_play to load vars for managed-node1 15896 1727203900.35888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203900.40601: done with get_vars() 15896 1727203900.40626: done getting variables 15896 1727203900.40692: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:40 -0400 (0:00:00.910) 0:00:45.996 ***** 15896 1727203900.40727: entering _queue_task() for managed-node1/service 15896 1727203900.42044: worker is 1 (out of 1 available) 15896 1727203900.42056: exiting _queue_task() for managed-node1/service 
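The task header just printed carries two timing stamps, e.g. `(0:00:00.910) 0:00:45.996`: the first is the duration of the task that just finished, the second the cumulative elapsed time of the run. A small sketch (hypothetical helper, not Ansible code) of parsing those stamps into seconds:

```python
def to_seconds(stamp):
    """Convert an H:MM:SS.mmm timing stamp to float seconds."""
    h, m, s = stamp.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

# Stamps from the "Enable and start wpa_supplicant" task header above.
duration = to_seconds("0:00:00.910")  # the previous task took ~0.91 s
elapsed = to_seconds("0:00:45.996")   # ~46 s into the play so far
```

Subtracting the two gives the cumulative time before the previous task started, which is handy when profiling which tasks dominate a long run.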
15896 1727203900.42068: done queuing things up, now waiting for results queue to drain 15896 1727203900.42070: waiting for pending results... 15896 1727203900.42894: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203900.43083: in run() - task 028d2410-947f-fb83-b6ad-000000000128 15896 1727203900.43087: variable 'ansible_search_path' from source: unknown 15896 1727203900.43090: variable 'ansible_search_path' from source: unknown 15896 1727203900.43092: calling self._execute() 15896 1727203900.43287: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203900.43300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203900.43317: variable 'omit' from source: magic vars 15896 1727203900.44135: variable 'ansible_distribution_major_version' from source: facts 15896 1727203900.44154: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203900.44270: variable 'network_provider' from source: set_fact 15896 1727203900.44390: Evaluated conditional (network_provider == "nm"): True 15896 1727203900.44481: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203900.44667: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203900.45243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203900.51515: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203900.51580: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203900.51757: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203900.51791: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203900.51817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203900.52082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203900.52112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203900.52137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203900.52381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203900.52384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203900.52386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203900.52496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203900.52582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 15896 1727203900.52586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203900.52589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203900.52721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203900.52744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203900.52768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203900.52948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203900.52965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203900.53293: variable 'network_connections' from source: task vars 15896 1727203900.53369: variable 'port1_profile' from source: play vars 15896 1727203900.53580: variable 'port1_profile' from source: play vars 15896 1727203900.53584: variable 'port2_profile' from source: play vars 15896 1727203900.53586: variable 'port2_profile' from source: play vars 15896 1727203900.53740: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203900.54040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203900.54078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203900.54421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203900.54424: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203900.54681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203900.54684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203900.54686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203900.54689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203900.54692: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203900.55143: variable 'network_connections' from source: task vars 15896 1727203900.55149: variable 'port1_profile' from source: play vars 15896 1727203900.55294: variable 'port1_profile' from source: play vars 15896 1727203900.55301: variable 'port2_profile' from source: play vars 15896 1727203900.55481: variable 'port2_profile' from source: play vars 15896 1727203900.55499: Evaluated conditional 
(__network_wpa_supplicant_required): False 15896 1727203900.55502: when evaluation is False, skipping this task 15896 1727203900.55514: _execute() done 15896 1727203900.55516: dumping result to json 15896 1727203900.55519: done dumping result, returning 15896 1727203900.55522: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-fb83-b6ad-000000000128] 15896 1727203900.55524: sending task result for task 028d2410-947f-fb83-b6ad-000000000128 15896 1727203900.55729: done sending task result for task 028d2410-947f-fb83-b6ad-000000000128 15896 1727203900.55732: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15896 1727203900.55805: no more pending results, returning what we have 15896 1727203900.55809: results queue empty 15896 1727203900.55809: checking for any_errors_fatal 15896 1727203900.55838: done checking for any_errors_fatal 15896 1727203900.55839: checking for max_fail_percentage 15896 1727203900.55841: done checking for max_fail_percentage 15896 1727203900.55842: checking to see if all hosts have failed and the running result is not ok 15896 1727203900.55843: done checking to see if all hosts have failed 15896 1727203900.55843: getting the remaining hosts for this loop 15896 1727203900.55845: done getting the remaining hosts for this loop 15896 1727203900.55849: getting the next task for host managed-node1 15896 1727203900.55857: done getting next task for host managed-node1 15896 1727203900.55862: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203900.55865: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203900.55887: getting variables 15896 1727203900.55889: in VariableManager get_vars() 15896 1727203900.55942: Calling all_inventory to load vars for managed-node1 15896 1727203900.55945: Calling groups_inventory to load vars for managed-node1 15896 1727203900.55948: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203900.55958: Calling all_plugins_play to load vars for managed-node1 15896 1727203900.55962: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203900.55965: Calling groups_plugins_play to load vars for managed-node1 15896 1727203900.59192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203900.63997: done with get_vars() 15896 1727203900.64029: done getting variables 15896 1727203900.64092: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:51:40 -0400 (0:00:00.233) 0:00:46.230 ***** 15896 1727203900.64124: entering _queue_task() for managed-node1/service 15896 1727203900.65292: worker is 1 (out of 1 available) 15896 1727203900.65303: exiting _queue_task() for managed-node1/service 
15896 1727203900.65317: done queuing things up, now waiting for results queue to drain 15896 1727203900.65318: waiting for pending results... 15896 1727203900.65831: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203900.66127: in run() - task 028d2410-947f-fb83-b6ad-000000000129 15896 1727203900.66152: variable 'ansible_search_path' from source: unknown 15896 1727203900.66163: variable 'ansible_search_path' from source: unknown 15896 1727203900.66208: calling self._execute() 15896 1727203900.66508: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203900.66521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203900.66537: variable 'omit' from source: magic vars 15896 1727203900.67336: variable 'ansible_distribution_major_version' from source: facts 15896 1727203900.67441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203900.67682: variable 'network_provider' from source: set_fact 15896 1727203900.67694: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203900.67702: when evaluation is False, skipping this task 15896 1727203900.67709: _execute() done 15896 1727203900.67717: dumping result to json 15896 1727203900.67725: done dumping result, returning 15896 1727203900.67735: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-fb83-b6ad-000000000129] 15896 1727203900.67745: sending task result for task 028d2410-947f-fb83-b6ad-000000000129 15896 1727203900.68152: done sending task result for task 028d2410-947f-fb83-b6ad-000000000129 15896 1727203900.68156: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203900.68201: no more pending results, returning what we have 
15896 1727203900.68205: results queue empty 15896 1727203900.68205: checking for any_errors_fatal 15896 1727203900.68214: done checking for any_errors_fatal 15896 1727203900.68214: checking for max_fail_percentage 15896 1727203900.68216: done checking for max_fail_percentage 15896 1727203900.68217: checking to see if all hosts have failed and the running result is not ok 15896 1727203900.68217: done checking to see if all hosts have failed 15896 1727203900.68218: getting the remaining hosts for this loop 15896 1727203900.68219: done getting the remaining hosts for this loop 15896 1727203900.68222: getting the next task for host managed-node1 15896 1727203900.68229: done getting next task for host managed-node1 15896 1727203900.68233: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203900.68236: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203900.68258: getting variables 15896 1727203900.68259: in VariableManager get_vars() 15896 1727203900.68313: Calling all_inventory to load vars for managed-node1 15896 1727203900.68316: Calling groups_inventory to load vars for managed-node1 15896 1727203900.68318: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203900.68330: Calling all_plugins_play to load vars for managed-node1 15896 1727203900.68333: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203900.68336: Calling groups_plugins_play to load vars for managed-node1 15896 1727203900.71813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203900.74708: done with get_vars() 15896 1727203900.74734: done getting variables 15896 1727203900.75202: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:40 -0400 (0:00:00.111) 0:00:46.341 ***** 15896 1727203900.75236: entering _queue_task() for managed-node1/copy 15896 1727203900.76411: worker is 1 (out of 1 available) 15896 1727203900.76422: exiting _queue_task() for managed-node1/copy 15896 1727203900.76431: done queuing things up, now waiting for results queue to drain 15896 1727203900.76432: waiting for pending results... 
15896 1727203900.76599: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203900.76920: in run() - task 028d2410-947f-fb83-b6ad-00000000012a 15896 1727203900.76942: variable 'ansible_search_path' from source: unknown 15896 1727203900.76950: variable 'ansible_search_path' from source: unknown 15896 1727203900.77208: calling self._execute() 15896 1727203900.77273: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203900.77327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203900.77534: variable 'omit' from source: magic vars 15896 1727203900.78381: variable 'ansible_distribution_major_version' from source: facts 15896 1727203900.78385: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203900.78535: variable 'network_provider' from source: set_fact 15896 1727203900.78545: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203900.78552: when evaluation is False, skipping this task 15896 1727203900.78558: _execute() done 15896 1727203900.78569: dumping result to json 15896 1727203900.78578: done dumping result, returning 15896 1727203900.78590: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-fb83-b6ad-00000000012a] 15896 1727203900.78626: sending task result for task 028d2410-947f-fb83-b6ad-00000000012a skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203900.78794: no more pending results, returning what we have 15896 1727203900.78798: results queue empty 15896 1727203900.78799: checking for any_errors_fatal 15896 1727203900.78805: done checking for any_errors_fatal 15896 1727203900.78805: checking for max_fail_percentage 15896 
1727203900.78807: done checking for max_fail_percentage 15896 1727203900.78808: checking to see if all hosts have failed and the running result is not ok 15896 1727203900.78808: done checking to see if all hosts have failed 15896 1727203900.78809: getting the remaining hosts for this loop 15896 1727203900.78811: done getting the remaining hosts for this loop 15896 1727203900.78814: getting the next task for host managed-node1 15896 1727203900.78821: done getting next task for host managed-node1 15896 1727203900.78825: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203900.78828: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203900.78852: getting variables 15896 1727203900.78854: in VariableManager get_vars() 15896 1727203900.78909: Calling all_inventory to load vars for managed-node1 15896 1727203900.78911: Calling groups_inventory to load vars for managed-node1 15896 1727203900.78913: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203900.78927: Calling all_plugins_play to load vars for managed-node1 15896 1727203900.78930: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203900.78934: Calling groups_plugins_play to load vars for managed-node1 15896 1727203900.79455: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012a 15896 1727203900.79458: WORKER PROCESS EXITING 15896 1727203900.82908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203900.86982: done with get_vars() 15896 1727203900.87015: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:51:40 -0400 (0:00:00.122) 0:00:46.464 ***** 15896 1727203900.87509: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203900.88716: worker is 1 (out of 1 available) 15896 1727203900.88725: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203900.88737: done queuing things up, now waiting for results queue to drain 15896 1727203900.88739: waiting for pending results... 
15896 1727203900.88801: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203900.89111: in run() - task 028d2410-947f-fb83-b6ad-00000000012b 15896 1727203900.89200: variable 'ansible_search_path' from source: unknown 15896 1727203900.89208: variable 'ansible_search_path' from source: unknown 15896 1727203900.89248: calling self._execute() 15896 1727203900.89494: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203900.89881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203900.89885: variable 'omit' from source: magic vars 15896 1727203900.90582: variable 'ansible_distribution_major_version' from source: facts 15896 1727203900.90586: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203900.90588: variable 'omit' from source: magic vars 15896 1727203900.90590: variable 'omit' from source: magic vars 15896 1727203900.90783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203900.97340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203900.97640: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203900.97981: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203900.97985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203900.97987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203900.98056: variable 'network_provider' from source: set_fact 15896 1727203900.98382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203900.98428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203900.98463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203900.98585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203900.98606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203900.98731: variable 'omit' from source: magic vars 15896 1727203900.98977: variable 'omit' from source: magic vars 15896 1727203900.99580: variable 'network_connections' from source: task vars 15896 1727203900.99584: variable 'port1_profile' from source: play vars 15896 1727203900.99586: variable 'port1_profile' from source: play vars 15896 1727203900.99961: variable 'port2_profile' from source: play vars 15896 1727203900.99965: variable 'port2_profile' from source: play vars 15896 1727203901.00683: variable 'omit' from source: magic vars 15896 1727203901.00687: variable '__lsr_ansible_managed' from source: task vars 15896 1727203901.00689: variable '__lsr_ansible_managed' from source: task vars 15896 1727203901.01092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15896 1727203901.01649: Loaded config def from plugin (lookup/template) 15896 1727203901.01663: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15896 1727203901.01698: File lookup term: get_ansible_managed.j2 15896 1727203901.01705: variable 'ansible_search_path' from source: unknown 15896 1727203901.01766: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15896 1727203901.01786: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15896 1727203901.01809: variable 'ansible_search_path' from source: unknown 15896 1727203901.20669: variable 'ansible_managed' from source: unknown 15896 1727203901.21402: variable 'omit' from source: magic vars 15896 1727203901.21406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203901.21409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203901.21411: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203901.21514: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203901.21680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203901.21684: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203901.21686: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203901.21688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203901.21962: Set connection var ansible_shell_type to sh 15896 1727203901.22166: Set connection var ansible_connection to ssh 15896 1727203901.22169: Set connection var ansible_shell_executable to /bin/sh 15896 1727203901.22171: Set connection var ansible_pipelining to False 15896 1727203901.22172: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203901.22174: Set connection var ansible_timeout to 10 15896 1727203901.22178: variable 'ansible_shell_executable' from source: unknown 15896 1727203901.22180: variable 'ansible_connection' from source: unknown 15896 1727203901.22181: variable 'ansible_module_compression' from source: unknown 15896 1727203901.22183: variable 'ansible_shell_type' from source: unknown 15896 1727203901.22185: variable 'ansible_shell_executable' from source: unknown 15896 1727203901.22187: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203901.22189: variable 'ansible_pipelining' from source: unknown 15896 1727203901.22190: variable 'ansible_timeout' from source: unknown 15896 1727203901.22192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203901.22683: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203901.22696: variable 'omit' from source: magic vars 15896 1727203901.22698: starting attempt loop 15896 1727203901.22700: running the handler 15896 1727203901.22702: _low_level_execute_command(): starting 15896 1727203901.22704: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203901.23995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.24226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203901.24391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.26117: stdout chunk (state=3): >>>/root <<< 15896 1727203901.26408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203901.26411: stdout chunk (state=3): >>><<< 15896 1727203901.26413: stderr chunk 
(state=3): >>><<< 15896 1727203901.26482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203901.26486: _low_level_execute_command(): starting 15896 1727203901.26489: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021 `" && echo ansible-tmp-1727203901.2643278-19375-177561588168021="` echo /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021 `" ) && sleep 0' 15896 1727203901.27635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203901.27682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203901.27685: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 15896 1727203901.27688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203901.27690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203901.27692: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203901.27725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.27729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203901.27731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203901.27866: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203901.27891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203901.27912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203901.28035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.30119: stdout chunk (state=3): >>>ansible-tmp-1727203901.2643278-19375-177561588168021=/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021 <<< 15896 1727203901.30382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203901.30386: stdout chunk (state=3): >>><<< 15896 1727203901.30389: stderr chunk (state=3): >>><<< 15896 1727203901.30552: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1727203901.2643278-19375-177561588168021=/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203901.30556: variable 'ansible_module_compression' from source: unknown 15896 1727203901.30665: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15896 1727203901.30714: variable 'ansible_facts' from source: unknown 15896 1727203901.31092: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py 15896 1727203901.31547: Sending initial data 15896 1727203901.31551: Sent initial data (168 bytes) 15896 1727203901.32626: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.32633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203901.32636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203901.32882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203901.33096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.34725: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203901.34832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203901.34940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpwof7k3mp /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py <<< 15896 1727203901.34944: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py" <<< 15896 1727203901.35231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpwof7k3mp" to remote "/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py" <<< 15896 1727203901.37740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203901.37744: stdout chunk (state=3): >>><<< 15896 1727203901.37747: stderr chunk (state=3): >>><<< 15896 1727203901.37749: done transferring module to remote 15896 1727203901.37751: _low_level_execute_command(): starting 15896 1727203901.37753: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/ /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py && sleep 0' 15896 1727203901.39114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203901.39411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203901.39514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.41581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203901.41585: stdout chunk (state=3): >>><<< 15896 1727203901.41592: stderr chunk (state=3): >>><<< 15896 1727203901.41632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203901.41636: _low_level_execute_command(): starting 15896 1727203901.41638: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/AnsiballZ_network_connections.py && sleep 0' 15896 1727203901.43195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203901.43199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203901.43327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.43331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203901.43334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.43461: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203901.43603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203901.43695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.88391: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/b00f4d37-e3cf-44e8-8e62-a68de6442d0c: error=unknown <<< 15896 1727203901.90542: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
bond0.1/c5333c4a-95d0-44cb-b04e-077f82270820: error=unknown <<< 15896 1727203901.90744: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15896 1727203901.93066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203901.93070: stdout chunk (state=3): >>><<< 15896 1727203901.93072: stderr chunk (state=3): >>><<< 15896 1727203901.93077: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/b00f4d37-e3cf-44e8-8e62-a68de6442d0c: error=unknown Traceback (most recent call 
last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_yq0biuw3/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/c5333c4a-95d0-44cb-b04e-077f82270820: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203901.93079: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203901.93082: _low_level_execute_command(): starting 15896 1727203901.93084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203901.2643278-19375-177561588168021/ > /dev/null 2>&1 
&& sleep 0' 15896 1727203901.94416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203901.94591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203901.94784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203901.94871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203901.97138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203901.97146: stdout chunk (state=3): >>><<< 15896 1727203901.97149: stderr chunk (state=3): >>><<< 15896 1727203901.97152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203901.97156: handler run complete 15896 1727203901.97159: attempt loop complete, returning result 15896 1727203901.97161: _execute() done 15896 1727203901.97163: dumping result to json 15896 1727203901.97166: done dumping result, returning 15896 1727203901.97168: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-fb83-b6ad-00000000012b] 15896 1727203901.97247: sending task result for task 028d2410-947f-fb83-b6ad-00000000012b 15896 1727203901.97700: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012b 15896 1727203901.97704: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15896 1727203901.98087: no more pending results, returning what we have 15896 1727203901.98092: 
results queue empty 15896 1727203901.98093: checking for any_errors_fatal 15896 1727203901.98099: done checking for any_errors_fatal 15896 1727203901.98100: checking for max_fail_percentage 15896 1727203901.98102: done checking for max_fail_percentage 15896 1727203901.98102: checking to see if all hosts have failed and the running result is not ok 15896 1727203901.98103: done checking to see if all hosts have failed 15896 1727203901.98104: getting the remaining hosts for this loop 15896 1727203901.98105: done getting the remaining hosts for this loop 15896 1727203901.98109: getting the next task for host managed-node1 15896 1727203901.98238: done getting next task for host managed-node1 15896 1727203901.98247: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203901.98250: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203901.98267: getting variables 15896 1727203901.98269: in VariableManager get_vars() 15896 1727203901.98331: Calling all_inventory to load vars for managed-node1 15896 1727203901.98334: Calling groups_inventory to load vars for managed-node1 15896 1727203901.98683: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203901.98698: Calling all_plugins_play to load vars for managed-node1 15896 1727203901.98702: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203901.98705: Calling groups_plugins_play to load vars for managed-node1 15896 1727203902.02352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203902.07073: done with get_vars() 15896 1727203902.07247: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:51:42 -0400 (0:00:01.199) 0:00:47.663 ***** 15896 1727203902.07533: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203902.08800: worker is 1 (out of 1 available) 15896 1727203902.08812: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203902.08823: done queuing things up, now waiting for results queue to drain 15896 1727203902.08825: waiting for pending results... 
15896 1727203902.09445: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203902.09838: in run() - task 028d2410-947f-fb83-b6ad-00000000012c 15896 1727203902.09843: variable 'ansible_search_path' from source: unknown 15896 1727203902.09845: variable 'ansible_search_path' from source: unknown 15896 1727203902.09849: calling self._execute() 15896 1727203902.10138: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.10141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.10154: variable 'omit' from source: magic vars 15896 1727203902.11278: variable 'ansible_distribution_major_version' from source: facts 15896 1727203902.11281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203902.11561: variable 'network_state' from source: role '' defaults 15896 1727203902.11579: Evaluated conditional (network_state != {}): False 15896 1727203902.11586: when evaluation is False, skipping this task 15896 1727203902.11635: _execute() done 15896 1727203902.11639: dumping result to json 15896 1727203902.11641: done dumping result, returning 15896 1727203902.11650: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-fb83-b6ad-00000000012c] 15896 1727203902.11656: sending task result for task 028d2410-947f-fb83-b6ad-00000000012c 15896 1727203902.11870: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012c 15896 1727203902.11872: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203902.12004: no more pending results, returning what we have 15896 1727203902.12008: results queue empty 15896 1727203902.12009: checking for any_errors_fatal 15896 1727203902.12189: done checking for any_errors_fatal 
15896 1727203902.12191: checking for max_fail_percentage 15896 1727203902.12193: done checking for max_fail_percentage 15896 1727203902.12194: checking to see if all hosts have failed and the running result is not ok 15896 1727203902.12195: done checking to see if all hosts have failed 15896 1727203902.12195: getting the remaining hosts for this loop 15896 1727203902.12197: done getting the remaining hosts for this loop 15896 1727203902.12200: getting the next task for host managed-node1 15896 1727203902.12206: done getting next task for host managed-node1 15896 1727203902.12210: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203902.12213: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203902.12232: getting variables 15896 1727203902.12234: in VariableManager get_vars() 15896 1727203902.12322: Calling all_inventory to load vars for managed-node1 15896 1727203902.12325: Calling groups_inventory to load vars for managed-node1 15896 1727203902.12328: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203902.12338: Calling all_plugins_play to load vars for managed-node1 15896 1727203902.12341: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203902.12343: Calling groups_plugins_play to load vars for managed-node1 15896 1727203902.16031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203902.19783: done with get_vars() 15896 1727203902.19809: done getting variables 15896 1727203902.19905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:51:42 -0400 (0:00:00.125) 0:00:47.788 ***** 15896 1727203902.19939: entering _queue_task() for managed-node1/debug 15896 1727203902.20694: worker is 1 (out of 1 available) 15896 1727203902.20773: exiting _queue_task() for managed-node1/debug 15896 1727203902.20786: done queuing things up, now waiting for results queue to drain 15896 1727203902.20789: waiting for pending results... 
15896 1727203902.21355: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203902.21496: in run() - task 028d2410-947f-fb83-b6ad-00000000012d 15896 1727203902.21500: variable 'ansible_search_path' from source: unknown 15896 1727203902.21502: variable 'ansible_search_path' from source: unknown 15896 1727203902.21604: calling self._execute() 15896 1727203902.21863: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.21872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.21886: variable 'omit' from source: magic vars 15896 1727203902.23223: variable 'ansible_distribution_major_version' from source: facts 15896 1727203902.23227: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203902.23229: variable 'omit' from source: magic vars 15896 1727203902.23654: variable 'omit' from source: magic vars 15896 1727203902.23699: variable 'omit' from source: magic vars 15896 1727203902.23742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203902.24002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203902.24025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203902.24043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.24056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.24092: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203902.24095: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.24098: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203902.24573: Set connection var ansible_shell_type to sh 15896 1727203902.24586: Set connection var ansible_connection to ssh 15896 1727203902.24700: Set connection var ansible_shell_executable to /bin/sh 15896 1727203902.24704: Set connection var ansible_pipelining to False 15896 1727203902.24706: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203902.24708: Set connection var ansible_timeout to 10 15896 1727203902.24710: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.24712: variable 'ansible_connection' from source: unknown 15896 1727203902.24715: variable 'ansible_module_compression' from source: unknown 15896 1727203902.24717: variable 'ansible_shell_type' from source: unknown 15896 1727203902.24719: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.24720: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.24722: variable 'ansible_pipelining' from source: unknown 15896 1727203902.24791: variable 'ansible_timeout' from source: unknown 15896 1727203902.24796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.25232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203902.25245: variable 'omit' from source: magic vars 15896 1727203902.25250: starting attempt loop 15896 1727203902.25253: running the handler 15896 1727203902.25494: variable '__network_connections_result' from source: set_fact 15896 1727203902.25705: handler run complete 15896 1727203902.25720: attempt loop complete, returning result 15896 1727203902.25725: _execute() done 15896 1727203902.25728: dumping result to json 15896 1727203902.25780: 
done dumping result, returning 15896 1727203902.25861: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-fb83-b6ad-00000000012d] 15896 1727203902.25868: sending task result for task 028d2410-947f-fb83-b6ad-00000000012d 15896 1727203902.25967: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012d 15896 1727203902.25970: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15896 1727203902.26074: no more pending results, returning what we have 15896 1727203902.26079: results queue empty 15896 1727203902.26079: checking for any_errors_fatal 15896 1727203902.26088: done checking for any_errors_fatal 15896 1727203902.26088: checking for max_fail_percentage 15896 1727203902.26090: done checking for max_fail_percentage 15896 1727203902.26091: checking to see if all hosts have failed and the running result is not ok 15896 1727203902.26091: done checking to see if all hosts have failed 15896 1727203902.26092: getting the remaining hosts for this loop 15896 1727203902.26093: done getting the remaining hosts for this loop 15896 1727203902.26096: getting the next task for host managed-node1 15896 1727203902.26102: done getting next task for host managed-node1 15896 1727203902.26107: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203902.26110: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203902.26121: getting variables 15896 1727203902.26123: in VariableManager get_vars() 15896 1727203902.26175: Calling all_inventory to load vars for managed-node1 15896 1727203902.26391: Calling groups_inventory to load vars for managed-node1 15896 1727203902.26394: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203902.26405: Calling all_plugins_play to load vars for managed-node1 15896 1727203902.26408: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203902.26411: Calling groups_plugins_play to load vars for managed-node1 15896 1727203902.29123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203902.32577: done with get_vars() 15896 1727203902.32612: done getting variables 15896 1727203902.32682: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:51:42 -0400 (0:00:00.128) 0:00:47.917 ***** 15896 1727203902.32831: entering _queue_task() for managed-node1/debug 15896 1727203902.33572: worker is 1 (out of 1 available) 15896 1727203902.33589: exiting _queue_task() for managed-node1/debug 15896 1727203902.33678: done queuing things up, now waiting for results queue to drain 15896 1727203902.33681: waiting for pending results... 
15896 1727203902.34380: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203902.34756: in run() - task 028d2410-947f-fb83-b6ad-00000000012e 15896 1727203902.34773: variable 'ansible_search_path' from source: unknown 15896 1727203902.34808: variable 'ansible_search_path' from source: unknown 15896 1727203902.35002: calling self._execute() 15896 1727203902.35313: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.35319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.35348: variable 'omit' from source: magic vars 15896 1727203902.36681: variable 'ansible_distribution_major_version' from source: facts 15896 1727203902.36686: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203902.36689: variable 'omit' from source: magic vars 15896 1727203902.36724: variable 'omit' from source: magic vars 15896 1727203902.36915: variable 'omit' from source: magic vars 15896 1727203902.36918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203902.36949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203902.36970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203902.37100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.37111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.37146: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203902.37149: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.37152: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203902.37350: Set connection var ansible_shell_type to sh 15896 1727203902.37358: Set connection var ansible_connection to ssh 15896 1727203902.37367: Set connection var ansible_shell_executable to /bin/sh 15896 1727203902.37372: Set connection var ansible_pipelining to False 15896 1727203902.37488: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203902.37495: Set connection var ansible_timeout to 10 15896 1727203902.37518: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.37527: variable 'ansible_connection' from source: unknown 15896 1727203902.37530: variable 'ansible_module_compression' from source: unknown 15896 1727203902.37533: variable 'ansible_shell_type' from source: unknown 15896 1727203902.37535: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.37537: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.37542: variable 'ansible_pipelining' from source: unknown 15896 1727203902.37545: variable 'ansible_timeout' from source: unknown 15896 1727203902.37549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.37904: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203902.37915: variable 'omit' from source: magic vars 15896 1727203902.37921: starting attempt loop 15896 1727203902.37924: running the handler 15896 1727203902.38090: variable '__network_connections_result' from source: set_fact 15896 1727203902.38166: variable '__network_connections_result' from source: set_fact 15896 1727203902.38514: handler run complete 15896 1727203902.38545: attempt loop complete, returning result 15896 1727203902.38548: 
_execute() done 15896 1727203902.38551: dumping result to json 15896 1727203902.38553: done dumping result, returning 15896 1727203902.38656: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-fb83-b6ad-00000000012e] 15896 1727203902.38659: sending task result for task 028d2410-947f-fb83-b6ad-00000000012e
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15896 1727203902.38857: no more pending results, returning what we have 15896 1727203902.38861: results queue empty 15896 1727203902.38862: checking for any_errors_fatal 15896 1727203902.39082: done checking for any_errors_fatal 15896 1727203902.39083: checking for max_fail_percentage 15896 1727203902.39085: done checking for max_fail_percentage 15896 1727203902.39086: checking to see if all hosts have failed and the running result is not ok 15896 1727203902.39087: done checking to see if all hosts have failed 15896 1727203902.39087: getting the remaining hosts for this loop 15896 1727203902.39089: done getting the remaining hosts for this loop 15896 1727203902.39092: getting the next task for host managed-node1 15896 1727203902.39098: done getting next task for host managed-node1 15896 1727203902.39102: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203902.39104: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True,
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203902.39115: getting variables 15896 1727203902.39117: in VariableManager get_vars() 15896 1727203902.39162: Calling all_inventory to load vars for managed-node1 15896 1727203902.39165: Calling groups_inventory to load vars for managed-node1 15896 1727203902.39167: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203902.39181: Calling all_plugins_play to load vars for managed-node1 15896 1727203902.39185: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203902.39189: Calling groups_plugins_play to load vars for managed-node1 15896 1727203902.39953: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012e 15896 1727203902.39957: WORKER PROCESS EXITING 15896 1727203902.42212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203902.45336: done with get_vars() 15896 1727203902.45365: done getting variables 15896 1727203902.45637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:51:42 -0400 (0:00:00.128) 0:00:48.046 ***** 15896
1727203902.45679: entering _queue_task() for managed-node1/debug 15896 1727203902.46531: worker is 1 (out of 1 available) 15896 1727203902.46543: exiting _queue_task() for managed-node1/debug 15896 1727203902.46555: done queuing things up, now waiting for results queue to drain 15896 1727203902.46557: waiting for pending results... 15896 1727203902.47069: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203902.47360: in run() - task 028d2410-947f-fb83-b6ad-00000000012f 15896 1727203902.47379: variable 'ansible_search_path' from source: unknown 15896 1727203902.47383: variable 'ansible_search_path' from source: unknown 15896 1727203902.47416: calling self._execute() 15896 1727203902.47735: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.47740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.47759: variable 'omit' from source: magic vars 15896 1727203902.48543: variable 'ansible_distribution_major_version' from source: facts 15896 1727203902.48554: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203902.48898: variable 'network_state' from source: role '' defaults 15896 1727203902.48964: Evaluated conditional (network_state != {}): False 15896 1727203902.48967: when evaluation is False, skipping this task 15896 1727203902.48970: _execute() done 15896 1727203902.48972: dumping result to json 15896 1727203902.48974: done dumping result, returning 15896 1727203902.48979: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-fb83-b6ad-00000000012f] 15896 1727203902.48980: sending task result for task 028d2410-947f-fb83-b6ad-00000000012f 15896 1727203902.49048: done sending task result for task 028d2410-947f-fb83-b6ad-00000000012f 15896 1727203902.49053: WORKER PROCESS EXITING 
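The skip recorded above is driven by the task's `when:` guard. A hypothetical sketch of the task as it might appear in the role's tasks/main.yml — only the task name and the condition `network_state != {}` are confirmed by the log, the rest is an assumption:

```yaml
# Hypothetical reconstruction -- only the task name and the
# "network_state != {}" condition appear in the log above.
- name: Show debug messages for the network_state
  ansible.builtin.debug:
    var: network_state
  when: network_state != {}  # network_state keeps its role default {}, so this skips
```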
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
15896 1727203902.49112: no more pending results, returning what we have 15896 1727203902.49115: results queue empty 15896 1727203902.49116: checking for any_errors_fatal 15896 1727203902.49127: done checking for any_errors_fatal 15896 1727203902.49128: checking for max_fail_percentage 15896 1727203902.49130: done checking for max_fail_percentage 15896 1727203902.49130: checking to see if all hosts have failed and the running result is not ok 15896 1727203902.49131: done checking to see if all hosts have failed 15896 1727203902.49131: getting the remaining hosts for this loop 15896 1727203902.49133: done getting the remaining hosts for this loop 15896 1727203902.49136: getting the next task for host managed-node1 15896 1727203902.49143: done getting next task for host managed-node1 15896 1727203902.49146: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203902.49149: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15896 1727203902.49169: getting variables 15896 1727203902.49171: in VariableManager get_vars() 15896 1727203902.49230: Calling all_inventory to load vars for managed-node1 15896 1727203902.49233: Calling groups_inventory to load vars for managed-node1 15896 1727203902.49235: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203902.49247: Calling all_plugins_play to load vars for managed-node1 15896 1727203902.49251: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203902.49254: Calling groups_plugins_play to load vars for managed-node1 15896 1727203902.51986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203902.55138: done with get_vars() 15896 1727203902.55166: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:51:42 -0400 (0:00:00.096) 0:00:48.143 *****
15896 1727203902.55373: entering _queue_task() for managed-node1/ping 15896 1727203902.56223: worker is 1 (out of 1 available) 15896 1727203902.56235: exiting _queue_task() for managed-node1/ping 15896 1727203902.56247: done queuing things up, now waiting for results queue to drain 15896 1727203902.56249: waiting for pending results...
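The connectivity re-check queued here runs the ping module (the module exchange that follows transfers AnsiballZ_ping.py and gets back `{"ping": "pong"}`). A minimal sketch of such a task — hypothetical, the log confirms only the task name and the module used:

```yaml
# Hypothetical sketch -- the log confirms only the task name and the ping module.
- name: Re-test connectivity
  ansible.builtin.ping:
```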
15896 1727203902.56985: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203902.57270: in run() - task 028d2410-947f-fb83-b6ad-000000000130 15896 1727203902.57285: variable 'ansible_search_path' from source: unknown 15896 1727203902.57292: variable 'ansible_search_path' from source: unknown 15896 1727203902.57456: calling self._execute() 15896 1727203902.57781: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.57785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.57788: variable 'omit' from source: magic vars 15896 1727203902.58793: variable 'ansible_distribution_major_version' from source: facts 15896 1727203902.58797: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203902.58802: variable 'omit' from source: magic vars 15896 1727203902.58987: variable 'omit' from source: magic vars 15896 1727203902.59083: variable 'omit' from source: magic vars 15896 1727203902.59237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203902.59304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203902.59332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203902.59352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.59372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203902.59581: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203902.59588: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.59677: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203902.59927: Set connection var ansible_shell_type to sh 15896 1727203902.59937: Set connection var ansible_connection to ssh 15896 1727203902.59945: Set connection var ansible_shell_executable to /bin/sh 15896 1727203902.59947: Set connection var ansible_pipelining to False 15896 1727203902.59955: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203902.59958: Set connection var ansible_timeout to 10 15896 1727203902.60003: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.60006: variable 'ansible_connection' from source: unknown 15896 1727203902.60012: variable 'ansible_module_compression' from source: unknown 15896 1727203902.60014: variable 'ansible_shell_type' from source: unknown 15896 1727203902.60019: variable 'ansible_shell_executable' from source: unknown 15896 1727203902.60021: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203902.60023: variable 'ansible_pipelining' from source: unknown 15896 1727203902.60025: variable 'ansible_timeout' from source: unknown 15896 1727203902.60027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203902.60859: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203902.60864: variable 'omit' from source: magic vars 15896 1727203902.60908: starting attempt loop 15896 1727203902.60912: running the handler 15896 1727203902.60927: _low_level_execute_command(): starting 15896 1727203902.60938: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203902.63389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 
1727203902.63393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203902.63440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203902.63458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203902.63808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203902.65346: stdout chunk (state=3): >>>/root <<< 15896 1727203902.65466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203902.65520: stderr chunk (state=3): >>><<< 15896 1727203902.65523: stdout chunk (state=3): >>><<< 15896 1727203902.65743: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203902.65758: _low_level_execute_command(): starting 15896 1727203902.65768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166 `" && echo ansible-tmp-1727203902.6574275-19568-153304742605166="` echo /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166 `" ) && sleep 0' 15896 1727203902.67641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203902.67695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203902.67711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203902.67718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203902.67823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203902.70017: stdout chunk (state=3): >>>ansible-tmp-1727203902.6574275-19568-153304742605166=/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166 <<< 15896 1727203902.70050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203902.70146: stderr chunk (state=3): >>><<< 15896 1727203902.70161: stdout chunk (state=3): >>><<< 15896 1727203902.70188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203902.6574275-19568-153304742605166=/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203902.70280: variable 'ansible_module_compression' from source: unknown 15896 1727203902.70400: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15896 1727203902.70442: variable 'ansible_facts' from source: unknown 15896 1727203902.70694: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py 15896 1727203902.70943: Sending initial data 15896 1727203902.71151: Sent initial data (153 bytes) 15896 1727203902.72190: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203902.72602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203902.72605: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203902.72665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203902.72875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203902.74704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203902.74789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203902.75005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpigm5fibp /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py <<< 15896 1727203902.75062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py" <<< 15896 1727203902.75182: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpigm5fibp" to remote "/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py" <<< 15896 1727203902.77263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203902.77267: stdout chunk (state=3): >>><<< 15896 1727203902.77269: stderr chunk (state=3): >>><<< 15896 1727203902.77308: done transferring module to remote 15896 1727203902.77383: _low_level_execute_command(): starting 15896 1727203902.77387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/ /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py && sleep 0' 15896 1727203902.78693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203902.78873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203902.78909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203902.78919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203902.79065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203902.81186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203902.81189: stderr chunk (state=3): >>><<< 15896 1727203902.81192: stdout chunk (state=3): >>><<< 15896 1727203902.81194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203902.81201: _low_level_execute_command(): starting 15896 1727203902.81203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/AnsiballZ_ping.py && sleep 0' 15896 1727203902.82942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203902.83049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203902.83248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203902.83251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203902.99727: stdout chunk (state=3): >>> {"ping": "pong", "invocation": 
{"module_args": {"data": "pong"}}} <<< 15896 1727203903.01483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203903.01488: stdout chunk (state=3): >>><<< 15896 1727203903.01500: stderr chunk (state=3): >>><<< 15896 1727203903.01514: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203903.01549: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203903.01612: _low_level_execute_command(): starting 15896 1727203903.01618: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203902.6574275-19568-153304742605166/ > /dev/null 2>&1 && sleep 0' 15896 1727203903.03148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203903.03255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15896 1727203903.03263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203903.03266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.03269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.03457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.05492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.05496: stdout chunk (state=3): >>><<< 15896 1727203903.05501: stderr chunk (state=3): >>><<< 15896 1727203903.05526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203903.05588: handler run complete 15896 1727203903.05603: attempt loop complete, returning result 
15896 1727203903.05606: _execute() done 15896 1727203903.05609: dumping result to json 15896 1727203903.05617: done dumping result, returning 15896 1727203903.05627: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-fb83-b6ad-000000000130] 15896 1727203903.05633: sending task result for task 028d2410-947f-fb83-b6ad-000000000130 15896 1727203903.05945: done sending task result for task 028d2410-947f-fb83-b6ad-000000000130 15896 1727203903.05949: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 15896 1727203903.06036: no more pending results, returning what we have 15896 1727203903.06040: results queue empty 15896 1727203903.06040: checking for any_errors_fatal 15896 1727203903.06051: done checking for any_errors_fatal 15896 1727203903.06052: checking for max_fail_percentage 15896 1727203903.06054: done checking for max_fail_percentage 15896 1727203903.06055: checking to see if all hosts have failed and the running result is not ok 15896 1727203903.06055: done checking to see if all hosts have failed 15896 1727203903.06056: getting the remaining hosts for this loop 15896 1727203903.06058: done getting the remaining hosts for this loop 15896 1727203903.06064: getting the next task for host managed-node1 15896 1727203903.06076: done getting next task for host managed-node1 15896 1727203903.06080: ^ task is: TASK: meta (role_complete) 15896 1727203903.06083: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15896 1727203903.06096: getting variables 15896 1727203903.06098: in VariableManager get_vars() 15896 1727203903.06154: Calling all_inventory to load vars for managed-node1 15896 1727203903.06157: Calling groups_inventory to load vars for managed-node1 15896 1727203903.06162: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203903.06173: Calling all_plugins_play to load vars for managed-node1 15896 1727203903.06398: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203903.06404: Calling groups_plugins_play to load vars for managed-node1 15896 1727203903.25704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203903.28913: done with get_vars() 15896 1727203903.29062: done getting variables 15896 1727203903.29135: done queuing things up, now waiting for results queue to drain 15896 1727203903.29137: results queue empty 15896 1727203903.29138: checking for any_errors_fatal 15896 1727203903.29141: done checking for any_errors_fatal 15896 1727203903.29142: checking for max_fail_percentage 15896 1727203903.29143: done checking for max_fail_percentage 15896 1727203903.29179: checking to see if all hosts have failed and the running result is not ok 15896 1727203903.29180: done checking to see if all hosts have failed 15896 1727203903.29181: getting the remaining hosts for this loop 15896 1727203903.29182: done getting the remaining hosts for this loop 15896 1727203903.29185: getting the next task for host managed-node1 15896 1727203903.29189: done getting next task for host managed-node1 15896 1727203903.29191: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 15896 1727203903.29193: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203903.29195: getting variables 15896 1727203903.29196: in VariableManager get_vars() 15896 1727203903.29218: Calling all_inventory to load vars for managed-node1 15896 1727203903.29221: Calling groups_inventory to load vars for managed-node1 15896 1727203903.29223: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203903.29228: Calling all_plugins_play to load vars for managed-node1 15896 1727203903.29231: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203903.29233: Calling groups_plugins_play to load vars for managed-node1 15896 1727203903.31586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203903.34831: done with get_vars() 15896 1727203903.34860: done getting variables 15896 1727203903.35021: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203903.35214: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Tuesday 24 September 2024 14:51:43 -0400 (0:00:00.798) 0:00:48.941 ***** 15896 1727203903.35240: entering _queue_task() for managed-node1/command 15896 1727203903.36036: worker is 1 (out of 1 available) 15896 1727203903.36048: exiting _queue_task() for managed-node1/command 15896 1727203903.36061: done queuing things up, now waiting for results queue to drain 15896 1727203903.36062: waiting for pending results... 
15896 1727203903.36578: running TaskExecutor() for managed-node1/TASK: From the active connection, get the controller profile "bond0" 15896 1727203903.36862: in run() - task 028d2410-947f-fb83-b6ad-000000000160 15896 1727203903.37133: variable 'ansible_search_path' from source: unknown 15896 1727203903.37137: calling self._execute() 15896 1727203903.37212: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203903.37279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203903.37298: variable 'omit' from source: magic vars 15896 1727203903.38157: variable 'ansible_distribution_major_version' from source: facts 15896 1727203903.38206: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203903.38439: variable 'network_provider' from source: set_fact 15896 1727203903.38571: Evaluated conditional (network_provider == "nm"): True 15896 1727203903.38575: variable 'omit' from source: magic vars 15896 1727203903.38579: variable 'omit' from source: magic vars 15896 1727203903.38765: variable 'controller_profile' from source: play vars 15896 1727203903.38853: variable 'omit' from source: magic vars 15896 1727203903.39006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203903.39009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203903.39032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203903.39081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203903.39130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203903.39202: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203903.39229: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203903.39237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203903.39443: Set connection var ansible_shell_type to sh 15896 1727203903.39495: Set connection var ansible_connection to ssh 15896 1727203903.39507: Set connection var ansible_shell_executable to /bin/sh 15896 1727203903.39598: Set connection var ansible_pipelining to False 15896 1727203903.39601: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203903.39604: Set connection var ansible_timeout to 10 15896 1727203903.39607: variable 'ansible_shell_executable' from source: unknown 15896 1727203903.39609: variable 'ansible_connection' from source: unknown 15896 1727203903.39707: variable 'ansible_module_compression' from source: unknown 15896 1727203903.39710: variable 'ansible_shell_type' from source: unknown 15896 1727203903.39713: variable 'ansible_shell_executable' from source: unknown 15896 1727203903.39715: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203903.39717: variable 'ansible_pipelining' from source: unknown 15896 1727203903.39719: variable 'ansible_timeout' from source: unknown 15896 1727203903.39722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203903.40186: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203903.40190: variable 'omit' from source: magic vars 15896 1727203903.40192: starting attempt loop 15896 1727203903.40195: running the handler 15896 1727203903.40197: _low_level_execute_command(): starting 15896 1727203903.40199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 15896 1727203903.41560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203903.41794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.41822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203903.41884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.41899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.41992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.43860: stdout chunk (state=3): >>>/root <<< 15896 1727203903.44000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.44006: stdout chunk (state=3): >>><<< 15896 1727203903.44093: stderr chunk (state=3): >>><<< 15896 1727203903.44124: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203903.44137: _low_level_execute_command(): starting 15896 1727203903.44145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354 `" && echo ansible-tmp-1727203903.44122-19595-235420461299354="` echo /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354 `" ) && sleep 0' 15896 1727203903.45796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.45926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203903.45941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.46012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.46176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.48485: stdout chunk (state=3): >>>ansible-tmp-1727203903.44122-19595-235420461299354=/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354 <<< 15896 1727203903.48490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.48784: stderr chunk (state=3): >>><<< 15896 1727203903.48787: stdout chunk (state=3): >>><<< 15896 1727203903.48790: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203903.44122-19595-235420461299354=/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203903.48792: variable 'ansible_module_compression' from source: unknown 15896 1727203903.48800: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203903.48903: variable 'ansible_facts' from source: unknown 15896 1727203903.49071: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py 15896 1727203903.49572: Sending initial data 15896 1727203903.49578: Sent initial data (154 bytes) 15896 1727203903.50882: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203903.50888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203903.50929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203903.50942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.50957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 15896 1727203903.50964: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203903.51199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.51305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.53107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203903.53113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203903.53261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp6rbuffk4 /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py <<< 15896 1727203903.53265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py" <<< 15896 1727203903.53517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp6rbuffk4" to remote "/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py" <<< 15896 1727203903.55112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.55135: stderr chunk (state=3): >>><<< 15896 1727203903.55139: stdout chunk (state=3): >>><<< 15896 1727203903.55171: done transferring module to remote 15896 1727203903.55179: _low_level_execute_command(): starting 15896 1727203903.55185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/ /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py && sleep 0' 15896 1727203903.56389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203903.56395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203903.56444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203903.56450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.56603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.56755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.56832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.57126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.59099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.59237: stderr chunk (state=3): >>><<< 15896 1727203903.59246: stdout chunk (state=3): >>><<< 15896 1727203903.59454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203903.59458: _low_level_execute_command(): starting 15896 1727203903.59462: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/AnsiballZ_command.py && sleep 0' 15896 1727203903.60832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203903.60874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203903.60924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203903.60943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203903.61011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.61056: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203903.61081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.61096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.61237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.79984: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 8404d61d-4c80-4763-affe-7d26fa7e8dd3\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727203893\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 
(none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 8404d61d-4c80-4763-affe-7d26fa7e8dd3\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: no\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/20\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/17\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.120/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 
01:72:1d:e2:7a:b8:96\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204133\nDHCP4.OPTION[7]: host_name = managed-node1\nDHCP4.OPTION[8]: ip_address = 192.0.2.120\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::13/128\nIP6.ADDRESS[2]: 2001:db8::701d:e2ff:fe7a:b896/64\nIP6.ADDRESS[3]: fe80::701d:e2ff:fe7a:b896/64\nIP6.GATEWAY: fe80::acfe:e9ff:fe43:d346\nIP6.ROUTE[1]: dst = 2001:db8::13/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::acfe:e9ff:fe43:d346, mt = 300\nIP6.DNS[1]: 2001:db8::a0df:bdff:fecb:d81\nIP6.DNS[2]: fe80::acfe:e9ff:fe43:d346\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:e4:16:27:7e:9d:ee:c1:d8:91:f3:7b:ca:31:f6:5a:92\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::a0df:bdff:fecb:d81\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node1\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::13", "stderr": 
"", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:51:43.775328", "end": "2024-09-24 14:51:43.795261", "delta": "0:00:00.019933", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203903.82082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203903.82086: stdout chunk (state=3): >>><<< 15896 1727203903.82089: stderr chunk (state=3): >>><<< 15896 1727203903.82092: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: 8404d61d-4c80-4763-affe-7d26fa7e8dd3\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727203893\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 
65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 8404d61d-4c80-4763-affe-7d26fa7e8dd3\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: no\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/20\nGENERAL.CON-PATH: 
/org/freedesktop/NetworkManager/Settings/17\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.120/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:72:1d:e2:7a:b8:96\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727204133\nDHCP4.OPTION[7]: host_name = managed-node1\nDHCP4.OPTION[8]: ip_address = 192.0.2.120\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::13/128\nIP6.ADDRESS[2]: 2001:db8::701d:e2ff:fe7a:b896/64\nIP6.ADDRESS[3]: fe80::701d:e2ff:fe7a:b896/64\nIP6.GATEWAY: fe80::acfe:e9ff:fe43:d346\nIP6.ROUTE[1]: dst = 2001:db8::13/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::acfe:e9ff:fe43:d346, mt = 
300\nIP6.DNS[1]: 2001:db8::a0df:bdff:fecb:d81\nIP6.DNS[2]: fe80::acfe:e9ff:fe43:d346\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:e4:16:27:7e:9d:ee:c1:d8:91:f3:7b:ca:31:f6:5a:92\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::a0df:bdff:fecb:d81\nDHCP6.OPTION[3]: fqdn_fqdn = managed-node1\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::13", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-24 14:51:43.775328", "end": "2024-09-24 14:51:43.795261", "delta": "0:00:00.019933", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203903.82100: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203903.82103: _low_level_execute_command(): starting 15896 1727203903.82106: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203903.44122-19595-235420461299354/ > /dev/null 2>&1 && sleep 0' 15896 1727203903.83977: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.83982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203903.83985: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203903.84067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203903.84070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203903.84073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203903.84609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203903.86413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203903.86425: stdout chunk (state=3): >>><<< 15896 1727203903.86435: stderr chunk (state=3): >>><<< 15896 1727203903.86453: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203903.86515: handler run complete 15896 1727203903.86561: Evaluated conditional (False): False 15896 1727203903.86593: attempt loop complete, returning result 15896 1727203903.86648: _execute() done 15896 1727203903.86658: dumping result to json 15896 1727203903.86680: done dumping result, returning 15896 1727203903.86886: done running TaskExecutor() for managed-node1/TASK: From the active connection, get the controller profile "bond0" [028d2410-947f-fb83-b6ad-000000000160] 15896 1727203903.86889: sending task result for task 028d2410-947f-fb83-b6ad-000000000160 ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.019933", "end": "2024-09-24 14:51:43.795261", "rc": 0, "start": "2024-09-24 14:51:43.775328" } STDOUT: connection.id: bond0 connection.uuid: 8404d61d-4c80-4763-affe-7d26fa7e8dd3 connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1727203893 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- 
ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: 8404d61d-4c80-4763-affe-7d26fa7e8dd3 GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: no GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/20 GENERAL.CON-PATH: 
/org/freedesktop/NetworkManager/Settings/17 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.120/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:72:1d:e2:7a:b8:96 DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1727204133 DHCP4.OPTION[7]: host_name = managed-node1 DHCP4.OPTION[8]: ip_address = 192.0.2.120 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::13/128 IP6.ADDRESS[2]: 2001:db8::701d:e2ff:fe7a:b896/64 IP6.ADDRESS[3]: fe80::701d:e2ff:fe7a:b896/64 IP6.GATEWAY: fe80::acfe:e9ff:fe43:d346 IP6.ROUTE[1]: dst = 2001:db8::13/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[4]: dst = ::/0, nh = fe80::acfe:e9ff:fe43:d346, mt = 300 IP6.DNS[1]: 2001:db8::a0df:bdff:fecb:d81 
IP6.DNS[2]: fe80::acfe:e9ff:fe43:d346 DHCP6.OPTION[1]: dhcp6_client_id = 00:04:e4:16:27:7e:9d:ee:c1:d8:91:f3:7b:ca:31:f6:5a:92 DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::a0df:bdff:fecb:d81 DHCP6.OPTION[3]: fqdn_fqdn = managed-node1 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::13 15896 1727203903.87599: no more pending results, returning what we have 15896 1727203903.87617: results queue empty 15896 1727203903.87619: checking for any_errors_fatal 15896 1727203903.87637: done checking for any_errors_fatal 15896 1727203903.87638: checking for max_fail_percentage 15896 1727203903.87640: done checking for max_fail_percentage 15896 1727203903.87642: checking to see if all hosts have failed and the running result is not ok 15896 1727203903.87642: done checking to see if all hosts have failed 15896 1727203903.87798: getting the remaining hosts for this loop 15896 1727203903.87801: done getting the remaining hosts for this loop 15896 1727203903.87838: getting the next task for host managed-node1 15896 1727203903.87850: done getting next task for host managed-node1 15896 1727203903.87854: ^ task is: TASK: Assert that the controller profile is activated 15896 1727203903.87857: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203903.87864: getting variables 15896 1727203903.87867: in VariableManager get_vars() 15896 1727203903.88082: done sending task result for task 028d2410-947f-fb83-b6ad-000000000160 15896 1727203903.88086: WORKER PROCESS EXITING 15896 1727203903.88301: Calling all_inventory to load vars for managed-node1 15896 1727203903.88304: Calling groups_inventory to load vars for managed-node1 15896 1727203903.88306: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203903.88317: Calling all_plugins_play to load vars for managed-node1 15896 1727203903.88320: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203903.88323: Calling groups_plugins_play to load vars for managed-node1 15896 1727203903.93733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203903.98408: done with get_vars() 15896 1727203903.98485: done getting variables 15896 1727203903.98667: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Tuesday 24 September 2024 14:51:43 -0400 (0:00:00.634) 0:00:49.576 ***** 15896 1727203903.98702: entering _queue_task() for managed-node1/assert 15896 1727203903.99967: worker is 1 (out of 1 available) 15896 1727203903.99981: exiting _queue_task() for managed-node1/assert 15896 1727203903.99991: done queuing things up, now waiting for results queue to drain 15896 1727203903.99993: waiting for pending results... 
15896 1727203904.00348: running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated 15896 1727203904.00673: in run() - task 028d2410-947f-fb83-b6ad-000000000161 15896 1727203904.00743: variable 'ansible_search_path' from source: unknown 15896 1727203904.00933: calling self._execute() 15896 1727203904.01332: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.01336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.01340: variable 'omit' from source: magic vars 15896 1727203904.02600: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.02642: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.03085: variable 'network_provider' from source: set_fact 15896 1727203904.03091: Evaluated conditional (network_provider == "nm"): True 15896 1727203904.03102: variable 'omit' from source: magic vars 15896 1727203904.03256: variable 'omit' from source: magic vars 15896 1727203904.03588: variable 'controller_profile' from source: play vars 15896 1727203904.03592: variable 'omit' from source: magic vars 15896 1727203904.03786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203904.03789: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203904.04041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203904.04044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203904.04046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203904.04052: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203904.04054: variable 
'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.04056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.04543: Set connection var ansible_shell_type to sh 15896 1727203904.04610: Set connection var ansible_connection to ssh 15896 1727203904.04649: Set connection var ansible_shell_executable to /bin/sh 15896 1727203904.04691: Set connection var ansible_pipelining to False 15896 1727203904.04703: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203904.04752: Set connection var ansible_timeout to 10 15896 1727203904.05100: variable 'ansible_shell_executable' from source: unknown 15896 1727203904.05103: variable 'ansible_connection' from source: unknown 15896 1727203904.05106: variable 'ansible_module_compression' from source: unknown 15896 1727203904.05109: variable 'ansible_shell_type' from source: unknown 15896 1727203904.05111: variable 'ansible_shell_executable' from source: unknown 15896 1727203904.05113: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.05115: variable 'ansible_pipelining' from source: unknown 15896 1727203904.05118: variable 'ansible_timeout' from source: unknown 15896 1727203904.05121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.05499: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203904.05566: variable 'omit' from source: magic vars 15896 1727203904.05614: starting attempt loop 15896 1727203904.05642: running the handler 15896 1727203904.06023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203904.09062: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203904.09179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203904.09294: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203904.09342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203904.09385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203904.09482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203904.09533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203904.09587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203904.09641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203904.09664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203904.10017: variable 'active_controller_profile' from source: set_fact 15896 1727203904.10126: Evaluated conditional (active_controller_profile.stdout | length != 0): True 15896 1727203904.10202: handler run complete 15896 1727203904.10587: attempt 
loop complete, returning result 15896 1727203904.10590: _execute() done 15896 1727203904.10593: dumping result to json 15896 1727203904.10596: done dumping result, returning 15896 1727203904.10598: done running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated [028d2410-947f-fb83-b6ad-000000000161] 15896 1727203904.10600: sending task result for task 028d2410-947f-fb83-b6ad-000000000161 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 15896 1727203904.10751: no more pending results, returning what we have 15896 1727203904.10755: results queue empty 15896 1727203904.10756: checking for any_errors_fatal 15896 1727203904.10774: done checking for any_errors_fatal 15896 1727203904.10777: checking for max_fail_percentage 15896 1727203904.10780: done checking for max_fail_percentage 15896 1727203904.10781: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.10781: done checking to see if all hosts have failed 15896 1727203904.10782: getting the remaining hosts for this loop 15896 1727203904.10783: done getting the remaining hosts for this loop 15896 1727203904.10787: getting the next task for host managed-node1 15896 1727203904.10803: done getting next task for host managed-node1 15896 1727203904.10810: ^ task is: TASK: Get the controller device details 15896 1727203904.10812: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15896 1727203904.10895: getting variables 15896 1727203904.10898: in VariableManager get_vars() 15896 1727203904.11162: Calling all_inventory to load vars for managed-node1 15896 1727203904.11173: Calling groups_inventory to load vars for managed-node1 15896 1727203904.11178: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.11190: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.11193: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.11197: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.11883: done sending task result for task 028d2410-947f-fb83-b6ad-000000000161 15896 1727203904.11886: WORKER PROCESS EXITING 15896 1727203904.14542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.18202: done with get_vars() 15896 1727203904.18228: done getting variables 15896 1727203904.18314: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.196) 0:00:49.772 ***** 15896 1727203904.18354: entering _queue_task() for managed-node1/command 15896 1727203904.18863: worker is 1 (out of 1 available) 15896 1727203904.18880: exiting _queue_task() for managed-node1/command 15896 1727203904.18891: done queuing things up, now waiting for results queue to drain 15896 1727203904.18893: waiting for pending results... 
15896 1727203904.19222: running TaskExecutor() for managed-node1/TASK: Get the controller device details 15896 1727203904.19350: in run() - task 028d2410-947f-fb83-b6ad-000000000162 15896 1727203904.19373: variable 'ansible_search_path' from source: unknown 15896 1727203904.19424: calling self._execute() 15896 1727203904.19550: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.19564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.19583: variable 'omit' from source: magic vars 15896 1727203904.20287: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.20291: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.20367: variable 'network_provider' from source: set_fact 15896 1727203904.20381: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203904.20397: when evaluation is False, skipping this task 15896 1727203904.20404: _execute() done 15896 1727203904.20411: dumping result to json 15896 1727203904.20418: done dumping result, returning 15896 1727203904.20427: done running TaskExecutor() for managed-node1/TASK: Get the controller device details [028d2410-947f-fb83-b6ad-000000000162] 15896 1727203904.20436: sending task result for task 028d2410-947f-fb83-b6ad-000000000162 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203904.20757: no more pending results, returning what we have 15896 1727203904.20765: results queue empty 15896 1727203904.20766: checking for any_errors_fatal 15896 1727203904.20774: done checking for any_errors_fatal 15896 1727203904.20775: checking for max_fail_percentage 15896 1727203904.20779: done checking for max_fail_percentage 15896 1727203904.20780: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.20781: done checking 
to see if all hosts have failed 15896 1727203904.20781: getting the remaining hosts for this loop 15896 1727203904.20784: done getting the remaining hosts for this loop 15896 1727203904.20787: getting the next task for host managed-node1 15896 1727203904.20794: done getting next task for host managed-node1 15896 1727203904.20797: ^ task is: TASK: Assert that the controller profile is activated 15896 1727203904.20800: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15896 1727203904.20804: getting variables 15896 1727203904.20806: in VariableManager get_vars() 15896 1727203904.20868: Calling all_inventory to load vars for managed-node1 15896 1727203904.20872: Calling groups_inventory to load vars for managed-node1 15896 1727203904.20874: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.21116: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.21120: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.21124: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.21725: done sending task result for task 028d2410-947f-fb83-b6ad-000000000162 15896 1727203904.21729: WORKER PROCESS EXITING 15896 1727203904.24097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.26573: done with get_vars() 15896 1727203904.26603: done getting variables 15896 1727203904.26727: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [Assert that the controller profile is activated] *************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.084) 0:00:49.857 *****
15896 1727203904.26820: entering _queue_task() for managed-node1/assert 15896 1727203904.27748: worker is 1 (out of 1 available) 15896 1727203904.27763: exiting _queue_task() for managed-node1/assert 15896 1727203904.27823: done queuing things up, now waiting for results queue to drain 15896 1727203904.27825: waiting for pending results... 15896 1727203904.28283: running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated 15896 1727203904.28467: in run() - task 028d2410-947f-fb83-b6ad-000000000163 15896 1727203904.28479: variable 'ansible_search_path' from source: unknown 15896 1727203904.28665: calling self._execute() 15896 1727203904.28822: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.28829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.28957: variable 'omit' from source: magic vars 15896 1727203904.29796: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.29808: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.29926: variable 'network_provider' from source: set_fact 15896 1727203904.29932: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203904.29996: when evaluation is False, skipping this task 15896 1727203904.30001: _execute() done 15896 1727203904.30079: dumping result to json 15896 1727203904.30082: done dumping result, returning 15896 1727203904.30091: done running TaskExecutor() for managed-node1/TASK: Assert that the controller profile is activated [028d2410-947f-fb83-b6ad-000000000163] 15896 1727203904.30097: sending task result for task 
028d2410-947f-fb83-b6ad-000000000163 15896 1727203904.30413: done sending task result for task 028d2410-947f-fb83-b6ad-000000000163 15896 1727203904.30416: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203904.30471: no more pending results, returning what we have 15896 1727203904.30477: results queue empty 15896 1727203904.30478: checking for any_errors_fatal 15896 1727203904.30484: done checking for any_errors_fatal 15896 1727203904.30485: checking for max_fail_percentage 15896 1727203904.30487: done checking for max_fail_percentage 15896 1727203904.30488: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.30489: done checking to see if all hosts have failed 15896 1727203904.30490: getting the remaining hosts for this loop 15896 1727203904.30491: done getting the remaining hosts for this loop 15896 1727203904.30494: getting the next task for host managed-node1 15896 1727203904.30504: done getting next task for host managed-node1 15896 1727203904.30510: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203904.30514: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203904.30545: getting variables 15896 1727203904.30547: in VariableManager get_vars() 15896 1727203904.30611: Calling all_inventory to load vars for managed-node1 15896 1727203904.30614: Calling groups_inventory to load vars for managed-node1 15896 1727203904.30616: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.30628: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.30632: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.30755: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.34197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.35913: done with get_vars() 15896 1727203904.35943: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.092) 0:00:49.950 *****
15896 1727203904.36069: entering _queue_task() for managed-node1/include_tasks 15896 1727203904.36448: worker is 1 (out of 1 available) 15896 1727203904.36462: exiting _queue_task() for managed-node1/include_tasks 15896 1727203904.36475: done queuing things up, now waiting for results queue to drain 15896 1727203904.36480: waiting for pending results... 
15896 1727203904.36875: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15896 1727203904.36922: in run() - task 028d2410-947f-fb83-b6ad-00000000016c 15896 1727203904.36943: variable 'ansible_search_path' from source: unknown 15896 1727203904.36950: variable 'ansible_search_path' from source: unknown 15896 1727203904.37005: calling self._execute() 15896 1727203904.37214: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.37217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.37220: variable 'omit' from source: magic vars 15896 1727203904.37685: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.37689: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.37691: _execute() done 15896 1727203904.37692: dumping result to json 15896 1727203904.37698: done dumping result, returning 15896 1727203904.37707: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-fb83-b6ad-00000000016c] 15896 1727203904.37716: sending task result for task 028d2410-947f-fb83-b6ad-00000000016c 15896 1727203904.37943: no more pending results, returning what we have 15896 1727203904.37948: in VariableManager get_vars() 15896 1727203904.38013: Calling all_inventory to load vars for managed-node1 15896 1727203904.38016: Calling groups_inventory to load vars for managed-node1 15896 1727203904.38018: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.38031: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.38034: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.38037: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.38788: done sending task result for task 028d2410-947f-fb83-b6ad-00000000016c 15896 
1727203904.38792: WORKER PROCESS EXITING 15896 1727203904.40507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.41781: done with get_vars() 15896 1727203904.41796: variable 'ansible_search_path' from source: unknown 15896 1727203904.41797: variable 'ansible_search_path' from source: unknown 15896 1727203904.41824: we have included files to process 15896 1727203904.41825: generating all_blocks data 15896 1727203904.41827: done generating all_blocks data 15896 1727203904.41831: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203904.41831: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203904.41833: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15896 1727203904.42237: done processing included file 15896 1727203904.42239: iterating over new_blocks loaded from include file 15896 1727203904.42241: in VariableManager get_vars() 15896 1727203904.42264: done with get_vars() 15896 1727203904.42266: filtering new block on tags 15896 1727203904.42287: done filtering new block on tags 15896 1727203904.42289: in VariableManager get_vars() 15896 1727203904.42310: done with get_vars() 15896 1727203904.42312: filtering new block on tags 15896 1727203904.42337: done filtering new block on tags 15896 1727203904.42339: in VariableManager get_vars() 15896 1727203904.42357: done with get_vars() 15896 1727203904.42358: filtering new block on tags 15896 1727203904.42383: done filtering new block on tags 15896 1727203904.42384: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 15896 1727203904.42389: extending task lists for all hosts 
with included blocks 15896 1727203904.42982: done extending task lists 15896 1727203904.42983: done processing included files 15896 1727203904.42984: results queue empty 15896 1727203904.42984: checking for any_errors_fatal 15896 1727203904.42987: done checking for any_errors_fatal 15896 1727203904.42988: checking for max_fail_percentage 15896 1727203904.42988: done checking for max_fail_percentage 15896 1727203904.42989: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.42989: done checking to see if all hosts have failed 15896 1727203904.42990: getting the remaining hosts for this loop 15896 1727203904.42991: done getting the remaining hosts for this loop 15896 1727203904.42993: getting the next task for host managed-node1 15896 1727203904.42997: done getting next task for host managed-node1 15896 1727203904.42999: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203904.43001: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203904.43008: getting variables 15896 1727203904.43009: in VariableManager get_vars() 15896 1727203904.43022: Calling all_inventory to load vars for managed-node1 15896 1727203904.43023: Calling groups_inventory to load vars for managed-node1 15896 1727203904.43024: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.43028: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.43029: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.43031: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.44193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.45685: done with get_vars() 15896 1727203904.45706: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.096) 0:00:50.046 *****
15896 1727203904.45764: entering _queue_task() for managed-node1/setup 15896 1727203904.46042: worker is 1 (out of 1 available) 15896 1727203904.46055: exiting _queue_task() for managed-node1/setup 15896 1727203904.46068: done queuing things up, now waiting for results queue to drain 15896 1727203904.46069: waiting for pending results... 
15896 1727203904.46272: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15896 1727203904.46378: in run() - task 028d2410-947f-fb83-b6ad-000000000914 15896 1727203904.46391: variable 'ansible_search_path' from source: unknown 15896 1727203904.46395: variable 'ansible_search_path' from source: unknown 15896 1727203904.46425: calling self._execute() 15896 1727203904.46508: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.46513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.46531: variable 'omit' from source: magic vars 15896 1727203904.46800: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.46808: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.47182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203904.49536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203904.49621: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203904.49679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203904.49715: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203904.49744: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203904.49844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203904.49917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203904.49956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203904.50005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203904.50010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203904.50065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203904.50082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203904.50107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203904.50130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203904.50140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203904.50256: variable '__network_required_facts' from source: role 
'' defaults 15896 1727203904.50264: variable 'ansible_facts' from source: unknown 15896 1727203904.50993: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15896 1727203904.50998: when evaluation is False, skipping this task 15896 1727203904.51000: _execute() done 15896 1727203904.51003: dumping result to json 15896 1727203904.51005: done dumping result, returning 15896 1727203904.51010: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-fb83-b6ad-000000000914] 15896 1727203904.51016: sending task result for task 028d2410-947f-fb83-b6ad-000000000914 15896 1727203904.51281: done sending task result for task 028d2410-947f-fb83-b6ad-000000000914 15896 1727203904.51284: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203904.51324: no more pending results, returning what we have 15896 1727203904.51328: results queue empty 15896 1727203904.51328: checking for any_errors_fatal 15896 1727203904.51330: done checking for any_errors_fatal 15896 1727203904.51330: checking for max_fail_percentage 15896 1727203904.51332: done checking for max_fail_percentage 15896 1727203904.51333: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.51333: done checking to see if all hosts have failed 15896 1727203904.51334: getting the remaining hosts for this loop 15896 1727203904.51335: done getting the remaining hosts for this loop 15896 1727203904.51338: getting the next task for host managed-node1 15896 1727203904.51347: done getting next task for host managed-node1 15896 1727203904.51349: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203904.51354: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, 
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203904.51373: getting variables 15896 1727203904.51375: in VariableManager get_vars() 15896 1727203904.51423: Calling all_inventory to load vars for managed-node1 15896 1727203904.51426: Calling groups_inventory to load vars for managed-node1 15896 1727203904.51428: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.51437: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.51440: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.51443: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.52285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.53835: done with get_vars() 15896 1727203904.53858: done getting variables
TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.081) 0:00:50.128 *****
15896 1727203904.53952: entering _queue_task() for managed-node1/stat 15896 1727203904.54246: worker is 1 (out of 1 available) 15896 1727203904.54259: exiting _queue_task() for managed-node1/stat 15896 1727203904.54271: done queuing things up, now waiting for results queue to drain 15896 1727203904.54272: waiting for pending results... 
15896 1727203904.54624: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15896 1727203904.54884: in run() - task 028d2410-947f-fb83-b6ad-000000000916 15896 1727203904.54888: variable 'ansible_search_path' from source: unknown 15896 1727203904.54892: variable 'ansible_search_path' from source: unknown 15896 1727203904.54900: calling self._execute() 15896 1727203904.54937: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.54949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.54963: variable 'omit' from source: magic vars 15896 1727203904.55427: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.55457: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.55658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203904.55964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203904.56010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203904.56036: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203904.56074: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203904.56137: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203904.56156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203904.56175: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203904.56199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203904.56262: variable '__network_is_ostree' from source: set_fact 15896 1727203904.56265: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203904.56268: when evaluation is False, skipping this task 15896 1727203904.56271: _execute() done 15896 1727203904.56273: dumping result to json 15896 1727203904.56277: done dumping result, returning 15896 1727203904.56285: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-fb83-b6ad-000000000916] 15896 1727203904.56290: sending task result for task 028d2410-947f-fb83-b6ad-000000000916 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203904.56583: no more pending results, returning what we have 15896 1727203904.56587: results queue empty 15896 1727203904.56588: checking for any_errors_fatal 15896 1727203904.56594: done checking for any_errors_fatal 15896 1727203904.56598: checking for max_fail_percentage 15896 1727203904.56600: done checking for max_fail_percentage 15896 1727203904.56600: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.56601: done checking to see if all hosts have failed 15896 1727203904.56602: getting the remaining hosts for this loop 15896 1727203904.56603: done getting the remaining hosts for this loop 15896 1727203904.56606: getting the next task for host managed-node1 15896 1727203904.56613: done getting next task for host managed-node1 15896 
1727203904.56616: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203904.56621: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203904.56642: getting variables 15896 1727203904.56643: in VariableManager get_vars() 15896 1727203904.56720: Calling all_inventory to load vars for managed-node1 15896 1727203904.56723: Calling groups_inventory to load vars for managed-node1 15896 1727203904.56726: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.56741: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.56744: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.56752: done sending task result for task 028d2410-947f-fb83-b6ad-000000000916 15896 1727203904.56756: WORKER PROCESS EXITING 15896 1727203904.56767: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.58087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.59830: done with get_vars() 15896 1727203904.59853: done getting variables 15896 1727203904.59916: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.059) 0:00:50.188 *****
15896 1727203904.59953: entering _queue_task() for managed-node1/set_fact 15896 1727203904.60266: worker is 1 (out of 1 available) 15896 1727203904.60480: exiting _queue_task() for managed-node1/set_fact 15896 1727203904.60491: done queuing things up, now waiting for results queue to drain 15896 1727203904.60493: waiting for pending results... 
15896 1727203904.60619: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15896 1727203904.60750: in run() - task 028d2410-947f-fb83-b6ad-000000000917 15896 1727203904.60770: variable 'ansible_search_path' from source: unknown 15896 1727203904.60825: variable 'ansible_search_path' from source: unknown 15896 1727203904.60828: calling self._execute() 15896 1727203904.60944: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.60958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.60982: variable 'omit' from source: magic vars 15896 1727203904.61368: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.61388: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.61583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203904.61825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203904.61911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203904.61945: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203904.61973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203904.62041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203904.62058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203904.62080: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203904.62098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203904.62167: variable '__network_is_ostree' from source: set_fact 15896 1727203904.62173: Evaluated conditional (not __network_is_ostree is defined): False 15896 1727203904.62177: when evaluation is False, skipping this task 15896 1727203904.62180: _execute() done 15896 1727203904.62183: dumping result to json 15896 1727203904.62185: done dumping result, returning 15896 1727203904.62193: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-fb83-b6ad-000000000917] 15896 1727203904.62196: sending task result for task 028d2410-947f-fb83-b6ad-000000000917 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15896 1727203904.62328: no more pending results, returning what we have 15896 1727203904.62331: results queue empty 15896 1727203904.62332: checking for any_errors_fatal 15896 1727203904.62350: done checking for any_errors_fatal 15896 1727203904.62351: checking for max_fail_percentage 15896 1727203904.62353: done checking for max_fail_percentage 15896 1727203904.62354: checking to see if all hosts have failed and the running result is not ok 15896 1727203904.62354: done checking to see if all hosts have failed 15896 1727203904.62355: getting the remaining hosts for this loop 15896 1727203904.62357: done getting the remaining hosts for this loop 15896 1727203904.62362: getting the next task for host managed-node1 15896 1727203904.62371: done getting next task for host managed-node1 15896 
1727203904.62374: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203904.62381: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203904.62391: done sending task result for task 028d2410-947f-fb83-b6ad-000000000917 15896 1727203904.62394: WORKER PROCESS EXITING 15896 1727203904.62407: getting variables 15896 1727203904.62408: in VariableManager get_vars() 15896 1727203904.62463: Calling all_inventory to load vars for managed-node1 15896 1727203904.62465: Calling groups_inventory to load vars for managed-node1 15896 1727203904.62467: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203904.62477: Calling all_plugins_play to load vars for managed-node1 15896 1727203904.62479: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203904.62482: Calling groups_plugins_play to load vars for managed-node1 15896 1727203904.63412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203904.64287: done with get_vars() 15896 1727203904.64305: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:51:44 -0400 (0:00:00.044) 0:00:50.233 ***** 15896 1727203904.64371: entering _queue_task() for managed-node1/service_facts 15896 1727203904.64711: worker is 1 (out of 1 available) 15896 1727203904.64723: exiting _queue_task() for managed-node1/service_facts 15896 1727203904.64736: done queuing things up, now waiting for results queue to drain 15896 1727203904.64737: waiting for pending results... 
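The `service_facts` module queued here eventually prints (later in this exchange) a JSON payload of the shape `{"ansible_facts": {"services": {"<unit>": {"name": ..., "state": ..., "status": ..., "source": ...}}}}`. A minimal sketch of filtering that shape down to running units; the function name is mine, and the sample below is a trimmed, hypothetical stand-in for the full payload, with field values copied from the log:

```python
import json

def running_services(payload: str) -> list[str]:
    """Return the sorted unit names whose state is "running" in a
    service_facts-style JSON payload."""
    services = json.loads(payload)["ansible_facts"]["services"]
    return sorted(name for name, svc in services.items()
                  if svc["state"] == "running")

# Trimmed sample in the same shape as the module output shown below.
sample = json.dumps({"ansible_facts": {"services": {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "kdump.service": {"name": "kdump.service", "state": "stopped",
                      "status": "enabled", "source": "systemd"},
}}})
```

On the trimmed sample this returns only `sshd.service`, since `kdump.service` is stopped.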
15896 1727203904.65111: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 15896 1727203904.65225: in run() - task 028d2410-947f-fb83-b6ad-000000000919 15896 1727203904.65246: variable 'ansible_search_path' from source: unknown 15896 1727203904.65254: variable 'ansible_search_path' from source: unknown 15896 1727203904.65301: calling self._execute() 15896 1727203904.65408: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.65436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.65455: variable 'omit' from source: magic vars 15896 1727203904.65994: variable 'ansible_distribution_major_version' from source: facts 15896 1727203904.66009: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203904.66018: variable 'omit' from source: magic vars 15896 1727203904.66174: variable 'omit' from source: magic vars 15896 1727203904.66179: variable 'omit' from source: magic vars 15896 1727203904.66183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203904.66213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203904.66237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203904.66260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203904.66282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203904.66314: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203904.66322: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.66328: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203904.66435: Set connection var ansible_shell_type to sh 15896 1727203904.66448: Set connection var ansible_connection to ssh 15896 1727203904.66457: Set connection var ansible_shell_executable to /bin/sh 15896 1727203904.66467: Set connection var ansible_pipelining to False 15896 1727203904.66478: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203904.66489: Set connection var ansible_timeout to 10 15896 1727203904.66518: variable 'ansible_shell_executable' from source: unknown 15896 1727203904.66580: variable 'ansible_connection' from source: unknown 15896 1727203904.66584: variable 'ansible_module_compression' from source: unknown 15896 1727203904.66586: variable 'ansible_shell_type' from source: unknown 15896 1727203904.66588: variable 'ansible_shell_executable' from source: unknown 15896 1727203904.66590: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203904.66592: variable 'ansible_pipelining' from source: unknown 15896 1727203904.66594: variable 'ansible_timeout' from source: unknown 15896 1727203904.66596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203904.66763: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203904.66781: variable 'omit' from source: magic vars 15896 1727203904.66791: starting attempt loop 15896 1727203904.66797: running the handler 15896 1727203904.66815: _low_level_execute_command(): starting 15896 1727203904.66832: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203904.67548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203904.67586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15896 1727203904.67600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203904.67695: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203904.67719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203904.67835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203904.69624: stdout chunk (state=3): >>>/root <<< 15896 1727203904.69772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203904.69788: stderr chunk (state=3): >>><<< 15896 1727203904.69798: stdout chunk (state=3): >>><<< 15896 1727203904.69828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203904.69881: _low_level_execute_command(): starting 15896 1727203904.69885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464 `" && echo ansible-tmp-1727203904.698465-19665-124229906300464="` echo /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464 `" ) && sleep 0' 15896 1727203904.70538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203904.70557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203904.70595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203904.70609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203904.70678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203904.70683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203904.70771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203904.72826: stdout chunk (state=3): >>>ansible-tmp-1727203904.698465-19665-124229906300464=/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464 <<< 15896 1727203904.72936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203904.72962: stderr chunk (state=3): >>><<< 15896 1727203904.72966: stdout chunk (state=3): >>><<< 15896 1727203904.72979: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203904.698465-19665-124229906300464=/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203904.73046: variable 'ansible_module_compression' from source: unknown 15896 1727203904.73094: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15896 1727203904.73153: variable 'ansible_facts' from source: unknown 15896 1727203904.73217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py 15896 1727203904.73410: Sending initial data 15896 1727203904.73423: Sent initial data (161 bytes) 15896 1727203904.74466: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203904.74470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203904.74512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203904.74522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203904.74546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203904.74643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203904.76380: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203904.76451: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203904.76529: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmptqlze8x1 /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py <<< 15896 1727203904.76532: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py" <<< 15896 1727203904.76601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmptqlze8x1" to remote "/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py" <<< 15896 1727203904.77552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203904.77556: stdout chunk (state=3): >>><<< 15896 1727203904.77558: stderr chunk (state=3): >>><<< 15896 1727203904.77661: done transferring module to remote 15896 1727203904.77666: _low_level_execute_command(): starting 15896 1727203904.77668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/ /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py && sleep 0' 15896 1727203904.78766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203904.78924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203904.79052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203904.81432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203904.81436: stdout chunk (state=3): >>><<< 15896 1727203904.81443: stderr chunk (state=3): >>><<< 15896 1727203904.81481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203904.81490: _low_level_execute_command(): starting 15896 1727203904.81493: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/AnsiballZ_service_facts.py && sleep 0' 15896 1727203904.82582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203904.82597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203904.82623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203904.82696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203904.82749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203904.82770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203904.82791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203904.83305: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15896 1727203906.60228: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15896 1727203906.60334: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", 
"status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15896 1727203906.60345: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15896 1727203906.62058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203906.62245: stderr chunk (state=3): >>><<< 15896 1727203906.62248: stdout chunk (state=3): >>><<< 15896 1727203906.62257: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": 
"microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203906.63094: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203906.63137: _low_level_execute_command(): starting 15896 1727203906.63140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203904.698465-19665-124229906300464/ > /dev/null 2>&1 && sleep 0' 15896 1727203906.63883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.63920: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203906.63938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203906.63957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.64089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203906.66010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203906.66034: stderr chunk (state=3): >>><<< 15896 1727203906.66037: stdout chunk (state=3): >>><<< 15896 1727203906.66054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203906.66059: handler run complete 15896 1727203906.66182: variable 'ansible_facts' from source: unknown 15896 1727203906.66280: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203906.66567: variable 'ansible_facts' from source: unknown 15896 1727203906.66738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203906.66988: attempt loop complete, returning result 15896 1727203906.66991: _execute() done 15896 1727203906.66993: dumping result to json 15896 1727203906.67002: done dumping result, returning 15896 1727203906.67014: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-fb83-b6ad-000000000919] 15896 1727203906.67022: sending task result for task 028d2410-947f-fb83-b6ad-000000000919 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203906.67916: no more pending results, returning what we have 15896 1727203906.67918: results queue empty 15896 1727203906.67919: checking for any_errors_fatal 15896 1727203906.67924: done checking for any_errors_fatal 15896 1727203906.67925: checking for max_fail_percentage 15896 1727203906.67926: done checking for max_fail_percentage 15896 1727203906.67927: checking to see if all hosts have failed and the running result is not ok 15896 1727203906.67928: done checking to see if all hosts have failed 15896 1727203906.67928: getting the remaining hosts for this loop 15896 1727203906.67930: done getting the remaining hosts for this loop 15896 1727203906.67933: getting the next task for host managed-node1 15896 1727203906.67939: done getting next task for host managed-node1 15896 1727203906.67942: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203906.67946: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203906.68113: getting variables 15896 1727203906.68115: in VariableManager get_vars() 15896 1727203906.68163: Calling all_inventory to load vars for managed-node1 15896 1727203906.68166: Calling groups_inventory to load vars for managed-node1 15896 1727203906.68168: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203906.68180: Calling all_plugins_play to load vars for managed-node1 15896 1727203906.68183: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203906.68186: Calling groups_plugins_play to load vars for managed-node1 15896 1727203906.68709: done sending task result for task 028d2410-947f-fb83-b6ad-000000000919 15896 1727203906.68713: WORKER PROCESS EXITING 15896 1727203906.69344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203906.70234: done with get_vars() 15896 1727203906.70253: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:51:46 -0400 (0:00:02.059) 0:00:52.292 ***** 15896 1727203906.70328: entering _queue_task() for managed-node1/package_facts 15896 1727203906.70607: worker is 1 (out of 1 available) 15896 1727203906.70618: exiting _queue_task() for managed-node1/package_facts 15896 1727203906.70630: done queuing things up, now waiting for results queue to drain 15896 1727203906.70631: waiting for pending results... 
15896 1727203906.70919: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15896 1727203906.71057: in run() - task 028d2410-947f-fb83-b6ad-00000000091a 15896 1727203906.71073: variable 'ansible_search_path' from source: unknown 15896 1727203906.71078: variable 'ansible_search_path' from source: unknown 15896 1727203906.71114: calling self._execute() 15896 1727203906.71344: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203906.71348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203906.71351: variable 'omit' from source: magic vars 15896 1727203906.71781: variable 'ansible_distribution_major_version' from source: facts 15896 1727203906.71785: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203906.71788: variable 'omit' from source: magic vars 15896 1727203906.71980: variable 'omit' from source: magic vars 15896 1727203906.71984: variable 'omit' from source: magic vars 15896 1727203906.71986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203906.71989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203906.72003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203906.72024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203906.72035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203906.72066: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203906.72070: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203906.72083: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 15896 1727203906.72176: Set connection var ansible_shell_type to sh 15896 1727203906.72222: Set connection var ansible_connection to ssh 15896 1727203906.72225: Set connection var ansible_shell_executable to /bin/sh 15896 1727203906.72227: Set connection var ansible_pipelining to False 15896 1727203906.72230: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203906.72232: Set connection var ansible_timeout to 10 15896 1727203906.72235: variable 'ansible_shell_executable' from source: unknown 15896 1727203906.72237: variable 'ansible_connection' from source: unknown 15896 1727203906.72239: variable 'ansible_module_compression' from source: unknown 15896 1727203906.72241: variable 'ansible_shell_type' from source: unknown 15896 1727203906.72243: variable 'ansible_shell_executable' from source: unknown 15896 1727203906.72245: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203906.72248: variable 'ansible_pipelining' from source: unknown 15896 1727203906.72249: variable 'ansible_timeout' from source: unknown 15896 1727203906.72252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203906.72585: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203906.72590: variable 'omit' from source: magic vars 15896 1727203906.72593: starting attempt loop 15896 1727203906.72595: running the handler 15896 1727203906.72597: _low_level_execute_command(): starting 15896 1727203906.72600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203906.73262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203906.73266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15896 1727203906.73268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203906.73271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203906.73274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203906.73278: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203906.73280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.73283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203906.73285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203906.73287: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203906.73289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203906.73291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203906.73293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203906.73295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203906.73370: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203906.73373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.73484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203906.75269: stdout chunk (state=3): >>>/root <<< 15896 1727203906.75367: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 15896 1727203906.75399: stderr chunk (state=3): >>><<< 15896 1727203906.75402: stdout chunk (state=3): >>><<< 15896 1727203906.75424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203906.75435: _low_level_execute_command(): starting 15896 1727203906.75443: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770 `" && echo ansible-tmp-1727203906.7542338-19817-79265123774770="` echo /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770 `" ) && sleep 0' 15896 1727203906.75918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203906.75921: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203906.75924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.75928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203906.75931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.75982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203906.75985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203906.75997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.76080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203906.78210: stdout chunk (state=3): >>>ansible-tmp-1727203906.7542338-19817-79265123774770=/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770 <<< 15896 1727203906.78326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203906.78336: stderr chunk (state=3): >>><<< 15896 1727203906.78350: stdout chunk (state=3): >>><<< 15896 1727203906.78372: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203906.7542338-19817-79265123774770=/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203906.78447: variable 'ansible_module_compression' from source: unknown 15896 1727203906.78503: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15896 1727203906.78569: variable 'ansible_facts' from source: unknown 15896 1727203906.78723: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py 15896 1727203906.78868: Sending initial data 15896 1727203906.78871: Sent initial data (161 bytes) 15896 1727203906.79526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.79599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203906.79670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.79864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203906.81611: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203906.81635: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203906.81714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203906.81822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmph9yjg5up /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py <<< 15896 1727203906.81826: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py" <<< 15896 1727203906.81889: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmph9yjg5up" to remote "/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py" <<< 15896 1727203906.83666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203906.83669: stdout chunk (state=3): >>><<< 15896 1727203906.83671: stderr chunk (state=3): >>><<< 15896 1727203906.83674: done transferring module to remote 15896 1727203906.83682: _low_level_execute_command(): starting 15896 1727203906.83685: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/ /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py && sleep 0' 15896 1727203906.84247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203906.84269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203906.84287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203906.84307: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203906.84348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.84394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.84486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203906.84509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.84606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203906.86616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203906.86626: stdout chunk (state=3): >>><<< 15896 1727203906.86637: stderr chunk (state=3): >>><<< 15896 1727203906.86665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203906.86685: _low_level_execute_command(): starting 15896 1727203906.86696: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/AnsiballZ_package_facts.py && sleep 0' 15896 1727203906.87371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203906.87395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203906.87410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203906.87437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203906.87456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203906.87561: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203906.87581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203906.87609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203906.87627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203906.87773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203907.34804: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": 
"google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": 
"20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15896 1727203907.34855: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": 
[{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", 
"version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 15896 1727203907.34895: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": 
[{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source":<<< 15896 1727203907.34903: stdout chunk (state=3): >>> "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 15896 1727203907.34908: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": 
"20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": 
"tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el1<<< 15896 1727203907.34954: stdout chunk (state=3): >>>0", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": 
"grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 15896 1727203907.34966: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", 
"version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 15896 1727203907.34975: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": 
"1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 15896 1727203907.34997: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", 
"release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 15896 1727203907.35007: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": 
"2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 15896 1727203907.35018: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15896 
1727203907.36927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203907.36961: stderr chunk (state=3): >>><<< 15896 1727203907.36964: stdout chunk (state=3): >>><<< 15896 1727203907.37008: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": 
"nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", 
"version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", 
"version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": 
[{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": 
[{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": 
[{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": 
"grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": 
"20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", 
"release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": 
"perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": 
"2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", 
"release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203907.38315: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203907.38331: _low_level_execute_command(): starting 15896 1727203907.38336: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203906.7542338-19817-79265123774770/ > /dev/null 2>&1 && sleep 0' 15896 1727203907.38839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203907.38843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203907.38845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 15896 1727203907.38847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203907.38849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203907.38908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203907.38913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203907.38915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203907.38990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203907.40969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203907.40997: stderr chunk (state=3): >>><<< 15896 1727203907.41000: stdout chunk (state=3): >>><<< 15896 1727203907.41013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203907.41019: handler run complete 15896 1727203907.41519: variable 'ansible_facts' from source: unknown 15896 1727203907.41803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.42941: variable 'ansible_facts' from source: unknown 15896 1727203907.43380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.44129: attempt loop complete, returning result 15896 1727203907.44132: _execute() done 15896 1727203907.44134: dumping result to json 15896 1727203907.44405: done dumping result, returning 15896 1727203907.44409: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-fb83-b6ad-00000000091a] 15896 1727203907.44412: sending task result for task 028d2410-947f-fb83-b6ad-00000000091a 15896 1727203907.46944: done sending task result for task 028d2410-947f-fb83-b6ad-00000000091a 15896 1727203907.46947: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203907.47114: no more pending results, returning what we have 15896 1727203907.47117: results queue empty 15896 1727203907.47118: checking for any_errors_fatal 15896 1727203907.47155: done checking for any_errors_fatal 15896 1727203907.47157: checking for max_fail_percentage 15896 1727203907.47158: done checking for max_fail_percentage 15896 1727203907.47159: checking to see if all hosts have failed and the running result is not 
ok 15896 1727203907.47160: done checking to see if all hosts have failed 15896 1727203907.47161: getting the remaining hosts for this loop 15896 1727203907.47162: done getting the remaining hosts for this loop 15896 1727203907.47166: getting the next task for host managed-node1 15896 1727203907.47172: done getting next task for host managed-node1 15896 1727203907.47179: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203907.47184: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203907.47207: getting variables 15896 1727203907.47208: in VariableManager get_vars() 15896 1727203907.47251: Calling all_inventory to load vars for managed-node1 15896 1727203907.47255: Calling groups_inventory to load vars for managed-node1 15896 1727203907.47257: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203907.47266: Calling all_plugins_play to load vars for managed-node1 15896 1727203907.47269: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203907.47272: Calling groups_plugins_play to load vars for managed-node1 15896 1727203907.49733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.52254: done with get_vars() 15896 1727203907.52286: done getting variables 15896 1727203907.52542: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.822) 0:00:53.115 ***** 15896 1727203907.52586: entering _queue_task() for managed-node1/debug 15896 1727203907.53055: worker is 1 (out of 1 available) 15896 1727203907.53067: exiting _queue_task() for managed-node1/debug 15896 1727203907.53082: done queuing things up, now waiting for results queue to drain 15896 1727203907.53083: waiting for pending results... 
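The `package_facts` task that just completed above registers a dictionary under `ansible_facts.packages`, keyed by package name, where each value is a list of installed instances carrying `version`, `release`, `epoch`, `arch`, and `source` fields. A minimal sketch of consuming that shape (the two sample entries are copied from the output above; `build_evr` is a hypothetical helper, not part of Ansible):

```python
# Sketch of the ansible_facts.packages structure reported by the
# package_facts task above. Sample entries are copied from the log;
# build_evr() is a hypothetical helper, not an Ansible API.
packages = {
    "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
             "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
}

def build_evr(pkg):
    """Render an rpm-style [epoch:]version-release string for one instance."""
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{epoch}{pkg['version']}-{pkg['release']}"

# Each key maps to a *list* because multiple instances of one package
# (e.g. multilib or kernel packages) can be installed side by side.
for name, instances in packages.items():
    for inst in instances:
        print(f"{name} {build_evr(inst)} ({inst['arch']})")
```

Note that a `null` epoch in the JSON (Python `None`) is rendered with no epoch prefix, while an explicit epoch such as `openssl`'s `1` appears as `1:3.2.2-12.el10`.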
15896 1727203907.53520: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 15896 1727203907.53538: in run() - task 028d2410-947f-fb83-b6ad-00000000016d 15896 1727203907.53558: variable 'ansible_search_path' from source: unknown 15896 1727203907.53565: variable 'ansible_search_path' from source: unknown 15896 1727203907.53608: calling self._execute() 15896 1727203907.53716: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203907.53726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.53740: variable 'omit' from source: magic vars 15896 1727203907.54179: variable 'ansible_distribution_major_version' from source: facts 15896 1727203907.54197: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203907.54208: variable 'omit' from source: magic vars 15896 1727203907.54287: variable 'omit' from source: magic vars 15896 1727203907.54400: variable 'network_provider' from source: set_fact 15896 1727203907.54421: variable 'omit' from source: magic vars 15896 1727203907.54465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203907.54508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203907.54537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203907.54608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203907.54786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203907.54789: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203907.54791: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 
1727203907.54795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.55023: Set connection var ansible_shell_type to sh 15896 1727203907.55037: Set connection var ansible_connection to ssh 15896 1727203907.55129: Set connection var ansible_shell_executable to /bin/sh 15896 1727203907.55132: Set connection var ansible_pipelining to False 15896 1727203907.55134: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203907.55137: Set connection var ansible_timeout to 10 15896 1727203907.55239: variable 'ansible_shell_executable' from source: unknown 15896 1727203907.55243: variable 'ansible_connection' from source: unknown 15896 1727203907.55245: variable 'ansible_module_compression' from source: unknown 15896 1727203907.55247: variable 'ansible_shell_type' from source: unknown 15896 1727203907.55249: variable 'ansible_shell_executable' from source: unknown 15896 1727203907.55250: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203907.55252: variable 'ansible_pipelining' from source: unknown 15896 1727203907.55257: variable 'ansible_timeout' from source: unknown 15896 1727203907.55260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.55677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203907.55681: variable 'omit' from source: magic vars 15896 1727203907.55683: starting attempt loop 15896 1727203907.55685: running the handler 15896 1727203907.55688: handler run complete 15896 1727203907.55689: attempt loop complete, returning result 15896 1727203907.55691: _execute() done 15896 1727203907.55693: dumping result to json 15896 1727203907.55695: done dumping result, returning 
15896 1727203907.55697: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-fb83-b6ad-00000000016d] 15896 1727203907.55983: sending task result for task 028d2410-947f-fb83-b6ad-00000000016d 15896 1727203907.56055: done sending task result for task 028d2410-947f-fb83-b6ad-00000000016d 15896 1727203907.56059: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 15896 1727203907.56155: no more pending results, returning what we have 15896 1727203907.56159: results queue empty 15896 1727203907.56160: checking for any_errors_fatal 15896 1727203907.56171: done checking for any_errors_fatal 15896 1727203907.56171: checking for max_fail_percentage 15896 1727203907.56173: done checking for max_fail_percentage 15896 1727203907.56174: checking to see if all hosts have failed and the running result is not ok 15896 1727203907.56175: done checking to see if all hosts have failed 15896 1727203907.56178: getting the remaining hosts for this loop 15896 1727203907.56179: done getting the remaining hosts for this loop 15896 1727203907.56183: getting the next task for host managed-node1 15896 1727203907.56191: done getting next task for host managed-node1 15896 1727203907.56195: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203907.56200: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203907.56215: getting variables 15896 1727203907.56217: in VariableManager get_vars() 15896 1727203907.56270: Calling all_inventory to load vars for managed-node1 15896 1727203907.56273: Calling groups_inventory to load vars for managed-node1 15896 1727203907.56679: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203907.56689: Calling all_plugins_play to load vars for managed-node1 15896 1727203907.56693: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203907.56696: Calling groups_plugins_play to load vars for managed-node1 15896 1727203907.59249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.61029: done with get_vars() 15896 1727203907.61238: done getting variables 15896 1727203907.61573: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.090) 0:00:53.205 ***** 15896 1727203907.61618: entering _queue_task() for managed-node1/fail 15896 
1727203907.62340: worker is 1 (out of 1 available) 15896 1727203907.62352: exiting _queue_task() for managed-node1/fail 15896 1727203907.62366: done queuing things up, now waiting for results queue to drain 15896 1727203907.62368: waiting for pending results... 15896 1727203907.62978: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15896 1727203907.63003: in run() - task 028d2410-947f-fb83-b6ad-00000000016e 15896 1727203907.63026: variable 'ansible_search_path' from source: unknown 15896 1727203907.63034: variable 'ansible_search_path' from source: unknown 15896 1727203907.63077: calling self._execute() 15896 1727203907.63215: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203907.63228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.63245: variable 'omit' from source: magic vars 15896 1727203907.63635: variable 'ansible_distribution_major_version' from source: facts 15896 1727203907.63654: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203907.63781: variable 'network_state' from source: role '' defaults 15896 1727203907.63797: Evaluated conditional (network_state != {}): False 15896 1727203907.63806: when evaluation is False, skipping this task 15896 1727203907.63813: _execute() done 15896 1727203907.63820: dumping result to json 15896 1727203907.63859: done dumping result, returning 15896 1727203907.63863: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-fb83-b6ad-00000000016e] 15896 1727203907.63865: sending task result for task 028d2410-947f-fb83-b6ad-00000000016e skipping: [managed-node1] => { "changed": false, "false_condition": 
"network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203907.64013: no more pending results, returning what we have 15896 1727203907.64018: results queue empty 15896 1727203907.64019: checking for any_errors_fatal 15896 1727203907.64026: done checking for any_errors_fatal 15896 1727203907.64026: checking for max_fail_percentage 15896 1727203907.64028: done checking for max_fail_percentage 15896 1727203907.64030: checking to see if all hosts have failed and the running result is not ok 15896 1727203907.64030: done checking to see if all hosts have failed 15896 1727203907.64031: getting the remaining hosts for this loop 15896 1727203907.64033: done getting the remaining hosts for this loop 15896 1727203907.64037: getting the next task for host managed-node1 15896 1727203907.64045: done getting next task for host managed-node1 15896 1727203907.64049: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203907.64054: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203907.64080: getting variables 15896 1727203907.64082: in VariableManager get_vars() 15896 1727203907.64134: Calling all_inventory to load vars for managed-node1 15896 1727203907.64136: Calling groups_inventory to load vars for managed-node1 15896 1727203907.64139: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203907.64151: Calling all_plugins_play to load vars for managed-node1 15896 1727203907.64155: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203907.64158: Calling groups_plugins_play to load vars for managed-node1 15896 1727203907.64989: done sending task result for task 028d2410-947f-fb83-b6ad-00000000016e 15896 1727203907.64993: WORKER PROCESS EXITING 15896 1727203907.66896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.67947: done with get_vars() 15896 1727203907.67964: done getting variables 15896 1727203907.68026: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.064) 0:00:53.269 ***** 15896 1727203907.68060: entering _queue_task() for managed-node1/fail 15896 1727203907.68388: worker is 1 (out of 1 available) 15896 1727203907.68399: exiting _queue_task() for managed-node1/fail 15896 1727203907.68412: done queuing things up, now waiting for results queue to drain 15896 1727203907.68413: waiting for pending results... 
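The two "Abort applying the network state configuration" tasks above are both skipped because the role-default `network_state` is an empty dict, so the `when: network_state != {}` guard evaluates False. As a rough sketch (not the actual Ansible source), the skip result shown in the log can be emulated like this; the `evaluate_task` helper and the use of `eval()` in place of Jinja2 rendering are illustrative assumptions:

```python
# Hedged sketch: how a `when: network_state != {}` guard yields the
# "skipping" result seen in the log. In the role, network_state
# defaults to {}, so the conditional is False and the task is skipped.
# eval() stands in for Ansible's Jinja2 conditional rendering here.

def evaluate_task(task_vars):
    condition = "network_state != {}"
    result = eval(condition, {}, task_vars)  # illustrative only
    if not result:
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

print(evaluate_task({"network_state": {}}))
# -> {'changed': False, 'false_condition': 'network_state != {}',
#     'skip_reason': 'Conditional result was False'}
```

Passing a non-empty `network_state` (e.g. `{"interfaces": []}`) would instead let the `fail` action run.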
15896 1727203907.68742: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15896 1727203907.68854: in run() - task 028d2410-947f-fb83-b6ad-00000000016f 15896 1727203907.68861: variable 'ansible_search_path' from source: unknown 15896 1727203907.68865: variable 'ansible_search_path' from source: unknown 15896 1727203907.68898: calling self._execute() 15896 1727203907.69190: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203907.69194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.69198: variable 'omit' from source: magic vars 15896 1727203907.69384: variable 'ansible_distribution_major_version' from source: facts 15896 1727203907.69396: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203907.69520: variable 'network_state' from source: role '' defaults 15896 1727203907.69529: Evaluated conditional (network_state != {}): False 15896 1727203907.69533: when evaluation is False, skipping this task 15896 1727203907.69535: _execute() done 15896 1727203907.69538: dumping result to json 15896 1727203907.69540: done dumping result, returning 15896 1727203907.69549: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-fb83-b6ad-00000000016f] 15896 1727203907.69554: sending task result for task 028d2410-947f-fb83-b6ad-00000000016f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203907.69897: no more pending results, returning what we have 15896 1727203907.69901: results queue empty 15896 1727203907.69902: checking for any_errors_fatal 15896 1727203907.69910: done checking for any_errors_fatal 
15896 1727203907.69911: checking for max_fail_percentage 15896 1727203907.69913: done checking for max_fail_percentage 15896 1727203907.69914: checking to see if all hosts have failed and the running result is not ok 15896 1727203907.69915: done checking to see if all hosts have failed 15896 1727203907.69915: getting the remaining hosts for this loop 15896 1727203907.69917: done getting the remaining hosts for this loop 15896 1727203907.69920: getting the next task for host managed-node1 15896 1727203907.69926: done getting next task for host managed-node1 15896 1727203907.69931: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203907.69935: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203907.70018: getting variables 15896 1727203907.70021: in VariableManager get_vars() 15896 1727203907.70147: Calling all_inventory to load vars for managed-node1 15896 1727203907.70150: Calling groups_inventory to load vars for managed-node1 15896 1727203907.70152: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203907.70157: done sending task result for task 028d2410-947f-fb83-b6ad-00000000016f 15896 1727203907.70163: WORKER PROCESS EXITING 15896 1727203907.70181: Calling all_plugins_play to load vars for managed-node1 15896 1727203907.70184: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203907.70188: Calling groups_plugins_play to load vars for managed-node1 15896 1727203907.72268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203907.73169: done with get_vars() 15896 1727203907.73189: done getting variables 15896 1727203907.73234: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:51:47 -0400 (0:00:00.051) 0:00:53.321 ***** 15896 1727203907.73262: entering _queue_task() for managed-node1/fail 15896 1727203907.73524: worker is 1 (out of 1 available) 15896 1727203907.73538: exiting _queue_task() for managed-node1/fail 15896 1727203907.73549: done queuing things up, now waiting for results queue to drain 15896 1727203907.73551: waiting for pending results... 
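The EL10 teaming abort task evaluated next hinges on the Jinja2 filter chain `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` (and the same chain over `network_state.get("interfaces", [])`). A plain-Python equivalent, offered only as an illustration of what that chain computes; the function name and sample profiles are assumptions, not part of the role:

```python
import re

# Illustrative equivalent of the teaming-detection conditional:
#   selectattr("type", "defined") keeps dicts that have a "type" key;
#   selectattr("type", "match", "^team$") keeps those whose type is "team";
#   | list | length > 0 asks whether any survived.

def has_team_connection(network_connections, network_state):
    def teams(items):
        return [
            c for c in items
            if "type" in c and re.match(r"^team$", c["type"])
        ]
    return (
        len(teams(network_connections)) > 0
        or len(teams(network_state.get("interfaces", []))) > 0
    )

# The run in this log defines no team profiles, so the abort is skipped.
print(has_team_connection([{"name": "bond0", "type": "bond"}], {}))
# -> False
```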
15896 1727203907.73745: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15896 1727203907.73877: in run() - task 028d2410-947f-fb83-b6ad-000000000170 15896 1727203907.73881: variable 'ansible_search_path' from source: unknown 15896 1727203907.73885: variable 'ansible_search_path' from source: unknown 15896 1727203907.73974: calling self._execute() 15896 1727203907.74012: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203907.74016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203907.74082: variable 'omit' from source: magic vars 15896 1727203907.75105: variable 'ansible_distribution_major_version' from source: facts 15896 1727203907.75322: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203907.75843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203907.88115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203907.88191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203907.88227: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203907.88262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203907.88281: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203907.88364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203907.88390: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203907.88429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203907.88468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203907.88503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203907.88587: variable 'ansible_distribution_major_version' from source: facts 15896 1727203907.88599: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15896 1727203907.88749: variable 'ansible_distribution' from source: facts 15896 1727203907.88753: variable '__network_rh_distros' from source: role '' defaults 15896 1727203907.88755: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15896 1727203907.89034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203907.89071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203907.89136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 
1727203907.89140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203907.89143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203907.89204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203907.89226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203907.89248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203907.89329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203907.89333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203907.89403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203907.89422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15896 1727203907.89441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203907.89521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203907.89697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203907.90016: variable 'network_connections' from source: task vars 15896 1727203907.90027: variable 'controller_profile' from source: play vars 15896 1727203907.90090: variable 'controller_profile' from source: play vars 15896 1727203907.90099: variable 'network_state' from source: role '' defaults 15896 1727203907.90280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203907.90403: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203907.90425: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203907.90473: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203907.90517: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203907.90585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203907.90588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203907.90621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203907.90637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203907.90663: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15896 1727203907.90670: when evaluation is False, skipping this task 15896 1727203907.90673: _execute() done 15896 1727203907.90678: dumping result to json 15896 1727203907.90681: done dumping result, returning 15896 1727203907.90690: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-fb83-b6ad-000000000170] 15896 1727203907.90693: sending task result for task 028d2410-947f-fb83-b6ad-000000000170 skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15896 1727203907.90898: no more pending results, returning what we have 15896 1727203907.90904: results queue empty 15896 1727203907.90906: checking for any_errors_fatal 15896 1727203907.90912: done checking for 
any_errors_fatal 15896 1727203907.90913: checking for max_fail_percentage 15896 1727203907.90915: done checking for max_fail_percentage 15896 1727203907.90916: checking to see if all hosts have failed and the running result is not ok 15896 1727203907.90917: done checking to see if all hosts have failed 15896 1727203907.90917: getting the remaining hosts for this loop 15896 1727203907.90919: done getting the remaining hosts for this loop 15896 1727203907.90923: getting the next task for host managed-node1 15896 1727203907.90930: done getting next task for host managed-node1 15896 1727203907.90939: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203907.90943: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203907.90969: getting variables 15896 1727203907.90971: in VariableManager get_vars() 15896 1727203907.91146: Calling all_inventory to load vars for managed-node1 15896 1727203907.91149: Calling groups_inventory to load vars for managed-node1 15896 1727203907.91151: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203907.91161: done sending task result for task 028d2410-947f-fb83-b6ad-000000000170 15896 1727203907.91165: WORKER PROCESS EXITING 15896 1727203907.91279: Calling all_plugins_play to load vars for managed-node1 15896 1727203907.91284: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203907.91288: Calling groups_plugins_play to load vars for managed-node1 15896 1727203908.06822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203908.10395: done with get_vars() 15896 1727203908.10431: done getting variables 15896 1727203908.10483: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.372) 0:00:53.694 ***** 15896 1727203908.10519: entering _queue_task() for managed-node1/dnf 15896 1727203908.11096: worker is 1 (out of 1 available) 15896 1727203908.11106: exiting _queue_task() for managed-node1/dnf 15896 1727203908.11117: done queuing things up, now waiting for results queue to drain 15896 1727203908.11119: waiting for pending results... 
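The DNF package-check task that runs next is gated twice in the log: first on `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` (True here), then on `__network_wireless_connections_defined or __network_team_connections_defined` (False, so it skips). A minimal sketch of the first gate, with the function name and fact dicts being assumptions for illustration:

```python
# Sketch of the distribution gate seen in the log: DNF-based update
# checks apply on Fedora, or on EL-family hosts with major version > 7.

def needs_dnf_check(facts):
    return (
        facts["ansible_distribution"] == "Fedora"
        or int(facts["ansible_distribution_major_version"]) > 7
    )

print(needs_dnf_check({
    "ansible_distribution": "RedHat",
    "ansible_distribution_major_version": "9",
}))
# -> True
```

Even when this gate passes, the task still skips unless wireless or team profiles are actually defined, which is what the `false_condition` in the result above records.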
15896 1727203908.11401: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15896 1727203908.11550: in run() - task 028d2410-947f-fb83-b6ad-000000000171 15896 1727203908.11555: variable 'ansible_search_path' from source: unknown 15896 1727203908.11561: variable 'ansible_search_path' from source: unknown 15896 1727203908.11588: calling self._execute() 15896 1727203908.11908: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203908.11911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203908.11914: variable 'omit' from source: magic vars 15896 1727203908.12129: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.12141: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203908.12544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203908.16127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203908.16204: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203908.16235: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203908.16268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203908.16373: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203908.16379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.16401: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.16424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.16459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.16482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.16591: variable 'ansible_distribution' from source: facts 15896 1727203908.16594: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.16613: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15896 1727203908.16721: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203908.16849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.16873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.16906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.16954: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.16972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.17008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.17036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.17179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.17183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.17185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.17187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.17190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 
1727203908.17208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.17235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.17294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.17561: variable 'network_connections' from source: task vars 15896 1727203908.17571: variable 'controller_profile' from source: play vars 15896 1727203908.17574: variable 'controller_profile' from source: play vars 15896 1727203908.17579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203908.17742: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203908.17791: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203908.17814: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203908.17885: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203908.17888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203908.17910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203908.17935: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.17959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203908.18008: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203908.18260: variable 'network_connections' from source: task vars 15896 1727203908.18263: variable 'controller_profile' from source: play vars 15896 1727203908.18323: variable 'controller_profile' from source: play vars 15896 1727203908.18365: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203908.18369: when evaluation is False, skipping this task 15896 1727203908.18371: _execute() done 15896 1727203908.18374: dumping result to json 15896 1727203908.18377: done dumping result, returning 15896 1727203908.18380: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000171] 15896 1727203908.18382: sending task result for task 028d2410-947f-fb83-b6ad-000000000171 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203908.18531: no more pending results, returning what we have 15896 1727203908.18534: results queue empty 15896 1727203908.18535: checking for any_errors_fatal 15896 1727203908.18545: done checking for any_errors_fatal 15896 1727203908.18546: checking for max_fail_percentage 15896 1727203908.18548: done checking for max_fail_percentage 15896 
1727203908.18549: checking to see if all hosts have failed and the running result is not ok 15896 1727203908.18550: done checking to see if all hosts have failed 15896 1727203908.18550: getting the remaining hosts for this loop 15896 1727203908.18552: done getting the remaining hosts for this loop 15896 1727203908.18556: getting the next task for host managed-node1 15896 1727203908.18563: done getting next task for host managed-node1 15896 1727203908.18568: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203908.18571: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203908.18594: getting variables 15896 1727203908.18597: in VariableManager get_vars() 15896 1727203908.18649: Calling all_inventory to load vars for managed-node1 15896 1727203908.18652: Calling groups_inventory to load vars for managed-node1 15896 1727203908.18655: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203908.18668: Calling all_plugins_play to load vars for managed-node1 15896 1727203908.18671: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203908.18674: Calling groups_plugins_play to load vars for managed-node1 15896 1727203908.19221: done sending task result for task 028d2410-947f-fb83-b6ad-000000000171 15896 1727203908.19226: WORKER PROCESS EXITING 15896 1727203908.20016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203908.21620: done with get_vars() 15896 1727203908.21641: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15896 1727203908.21722: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.112) 0:00:53.806 ***** 15896 1727203908.21753: entering _queue_task() for managed-node1/yum 15896 1727203908.22099: worker is 1 (out of 1 available) 15896 1727203908.22113: exiting _queue_task() for managed-node1/yum 15896 1727203908.22125: done queuing things up, now 
waiting for results queue to drain 15896 1727203908.22127: waiting for pending results... 15896 1727203908.23113: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15896 1727203908.23118: in run() - task 028d2410-947f-fb83-b6ad-000000000172 15896 1727203908.23136: variable 'ansible_search_path' from source: unknown 15896 1727203908.23139: variable 'ansible_search_path' from source: unknown 15896 1727203908.23179: calling self._execute() 15896 1727203908.23418: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203908.23422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203908.23425: variable 'omit' from source: magic vars 15896 1727203908.23704: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.23716: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203908.23966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203908.28378: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203908.28633: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203908.28671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203908.28704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203908.28844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203908.28924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.29067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.29096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.29136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.29150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.29368: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.29490: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15896 1727203908.29493: when evaluation is False, skipping this task 15896 1727203908.29496: _execute() done 15896 1727203908.29499: dumping result to json 15896 1727203908.29502: done dumping result, returning 15896 1727203908.29511: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000172] 15896 1727203908.29514: sending task result for task 028d2410-947f-fb83-b6ad-000000000172 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15896 1727203908.29769: no more pending results, returning 
what we have 15896 1727203908.29773: results queue empty 15896 1727203908.29774: checking for any_errors_fatal 15896 1727203908.29782: done checking for any_errors_fatal 15896 1727203908.29783: checking for max_fail_percentage 15896 1727203908.29785: done checking for max_fail_percentage 15896 1727203908.29786: checking to see if all hosts have failed and the running result is not ok 15896 1727203908.29787: done checking to see if all hosts have failed 15896 1727203908.29788: getting the remaining hosts for this loop 15896 1727203908.29789: done getting the remaining hosts for this loop 15896 1727203908.29793: getting the next task for host managed-node1 15896 1727203908.29802: done getting next task for host managed-node1 15896 1727203908.29807: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203908.29810: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203908.29834: getting variables 15896 1727203908.29836: in VariableManager get_vars() 15896 1727203908.30198: Calling all_inventory to load vars for managed-node1 15896 1727203908.30201: Calling groups_inventory to load vars for managed-node1 15896 1727203908.30204: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203908.30214: Calling all_plugins_play to load vars for managed-node1 15896 1727203908.30217: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203908.30220: Calling groups_plugins_play to load vars for managed-node1 15896 1727203908.30990: done sending task result for task 028d2410-947f-fb83-b6ad-000000000172 15896 1727203908.30994: WORKER PROCESS EXITING 15896 1727203908.33486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203908.37361: done with get_vars() 15896 1727203908.37492: done getting variables 15896 1727203908.37554: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.161) 0:00:53.968 ***** 15896 1727203908.37899: entering _queue_task() for managed-node1/fail 15896 1727203908.38460: worker is 1 (out of 1 available) 15896 1727203908.38471: exiting _queue_task() for managed-node1/fail 15896 1727203908.38986: done queuing things up, now waiting for results queue to drain 15896 1727203908.38988: waiting for pending results... 
15896 1727203908.39422: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15896 1727203908.39805: in run() - task 028d2410-947f-fb83-b6ad-000000000173 15896 1727203908.39817: variable 'ansible_search_path' from source: unknown 15896 1727203908.40072: variable 'ansible_search_path' from source: unknown 15896 1727203908.40109: calling self._execute() 15896 1727203908.40488: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203908.40492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203908.40628: variable 'omit' from source: magic vars 15896 1727203908.41582: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.41708: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203908.41942: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203908.42529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203908.51484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203908.51718: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203908.51772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203908.51928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203908.51996: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203908.52177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203908.52182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.52214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.52305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.52320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.52525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.52548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.52625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.52672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.52771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.52774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.52779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.53007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.53085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.53088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.53509: variable 'network_connections' from source: task vars 15896 1727203908.53523: variable 'controller_profile' from source: play vars 15896 1727203908.53689: variable 'controller_profile' from source: play vars 15896 1727203908.53737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203908.54099: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203908.54579: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203908.54623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 
1727203908.54670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203908.54726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203908.54764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203908.54803: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.54913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203908.54974: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203908.55308: variable 'network_connections' from source: task vars 15896 1727203908.55311: variable 'controller_profile' from source: play vars 15896 1727203908.55379: variable 'controller_profile' from source: play vars 15896 1727203908.55409: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15896 1727203908.55581: when evaluation is False, skipping this task 15896 1727203908.55584: _execute() done 15896 1727203908.55588: dumping result to json 15896 1727203908.55591: done dumping result, returning 15896 1727203908.55596: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000173] 15896 1727203908.55599: sending task result for task 028d2410-947f-fb83-b6ad-000000000173 15896 1727203908.55696: 
done sending task result for task 028d2410-947f-fb83-b6ad-000000000173 15896 1727203908.55698: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15896 1727203908.55760: no more pending results, returning what we have 15896 1727203908.55764: results queue empty 15896 1727203908.55765: checking for any_errors_fatal 15896 1727203908.55772: done checking for any_errors_fatal 15896 1727203908.55773: checking for max_fail_percentage 15896 1727203908.55776: done checking for max_fail_percentage 15896 1727203908.55778: checking to see if all hosts have failed and the running result is not ok 15896 1727203908.55778: done checking to see if all hosts have failed 15896 1727203908.55779: getting the remaining hosts for this loop 15896 1727203908.55781: done getting the remaining hosts for this loop 15896 1727203908.55784: getting the next task for host managed-node1 15896 1727203908.55792: done getting next task for host managed-node1 15896 1727203908.55796: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15896 1727203908.55799: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203908.55825: getting variables 15896 1727203908.55827: in VariableManager get_vars() 15896 1727203908.55964: Calling all_inventory to load vars for managed-node1 15896 1727203908.55968: Calling groups_inventory to load vars for managed-node1 15896 1727203908.55974: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203908.55991: Calling all_plugins_play to load vars for managed-node1 15896 1727203908.55995: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203908.55999: Calling groups_plugins_play to load vars for managed-node1 15896 1727203908.58020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203908.59799: done with get_vars() 15896 1727203908.59819: done getting variables 15896 1727203908.59873: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.220) 0:00:54.188 ***** 15896 1727203908.59908: entering _queue_task() for managed-node1/package 15896 1727203908.60244: worker is 1 (out of 1 available) 15896 1727203908.60256: exiting _queue_task() for managed-node1/package 15896 1727203908.60269: done queuing things up, now waiting for results queue to drain 15896 1727203908.60270: waiting for pending results... 
15896 1727203908.60551: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 15896 1727203908.60659: in run() - task 028d2410-947f-fb83-b6ad-000000000174 15896 1727203908.60700: variable 'ansible_search_path' from source: unknown 15896 1727203908.60716: variable 'ansible_search_path' from source: unknown 15896 1727203908.60724: calling self._execute() 15896 1727203908.60821: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203908.60825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203908.60839: variable 'omit' from source: magic vars 15896 1727203908.61387: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.61391: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203908.61465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203908.62069: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203908.62236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203908.62266: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203908.62481: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203908.62664: variable 'network_packages' from source: role '' defaults 15896 1727203908.62779: variable '__network_provider_setup' from source: role '' defaults 15896 1727203908.62808: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203908.62855: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203908.62869: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203908.62912: variable 
'__network_packages_default_nm' from source: role '' defaults 15896 1727203908.63054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203908.64814: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203908.64863: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203908.64936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203908.64965: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203908.64978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203908.65053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.65074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.65094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.65122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.65136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 
1727203908.65166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.65184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.65200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.65226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.65238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.65685: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203908.65924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.65952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.65989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.66035: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.66087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.66353: variable 'ansible_python' from source: facts 15896 1727203908.66394: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203908.66478: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203908.66536: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203908.66631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.66647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.66674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.66702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.66713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.66745: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203908.66764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203908.66784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.66808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203908.66820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203908.66924: variable 'network_connections' from source: task vars
15896 1727203908.66928: variable 'controller_profile' from source: play vars
15896 1727203908.67000: variable 'controller_profile' from source: play vars
15896 1727203908.67055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203908.67081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203908.67103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.67125: Loading
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203908.67164: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203908.67346: variable 'network_connections' from source: task vars
15896 1727203908.67349: variable 'controller_profile' from source: play vars
15896 1727203908.67420: variable 'controller_profile' from source: play vars
15896 1727203908.67445: variable '__network_packages_default_wireless' from source: role '' defaults
15896 1727203908.67502: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203908.67691: variable 'network_connections' from source: task vars
15896 1727203908.67696: variable 'controller_profile' from source: play vars
15896 1727203908.67740: variable 'controller_profile' from source: play vars
15896 1727203908.67758: variable '__network_packages_default_team' from source: role '' defaults
15896 1727203908.67813: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203908.68005: variable 'network_connections' from source: task vars
15896 1727203908.68009: variable 'controller_profile' from source: play vars
15896 1727203908.68055: variable 'controller_profile' from source: play vars
15896 1727203908.68094: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203908.68135: variable '__network_service_name_default_initscripts' from source: role '' defaults
15896 1727203908.68139: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203908.68184: variable '__network_packages_default_initscripts' from source: role '' defaults
15896 1727203908.68481: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15896 1727203908.69181: variable 'network_connections' from source: task vars
15896
1727203908.69184: variable 'controller_profile' from source: play vars
15896 1727203908.69186: variable 'controller_profile' from source: play vars
15896 1727203908.69189: variable 'ansible_distribution' from source: facts
15896 1727203908.69191: variable '__network_rh_distros' from source: role '' defaults
15896 1727203908.69193: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.69195: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15896 1727203908.69255: variable 'ansible_distribution' from source: facts
15896 1727203908.69261: variable '__network_rh_distros' from source: role '' defaults
15896 1727203908.69264: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.69278: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15896 1727203908.69535: variable 'ansible_distribution' from source: facts
15896 1727203908.69538: variable '__network_rh_distros' from source: role '' defaults
15896 1727203908.69540: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.69542: variable 'network_provider' from source: set_fact
15896 1727203908.69544: variable 'ansible_facts' from source: unknown
15896 1727203908.70305: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
15896 1727203908.70309: when evaluation is False, skipping this task
15896 1727203908.70311: _execute() done
15896 1727203908.70314: dumping result to json
15896 1727203908.70316: done dumping result, returning
15896 1727203908.70325: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-fb83-b6ad-000000000174]
15896 1727203908.70331: sending task result for task 028d2410-947f-fb83-b6ad-000000000174
15896 1727203908.70524: done sending task result for task 028d2410-947f-fb83-b6ad-000000000174
15896 1727203908.70527: WORKER PROCESS
EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
15896 1727203908.70578: no more pending results, returning what we have
15896 1727203908.70581: results queue empty
15896 1727203908.70582: checking for any_errors_fatal
15896 1727203908.70588: done checking for any_errors_fatal
15896 1727203908.70589: checking for max_fail_percentage
15896 1727203908.70591: done checking for max_fail_percentage
15896 1727203908.70592: checking to see if all hosts have failed and the running result is not ok
15896 1727203908.70593: done checking to see if all hosts have failed
15896 1727203908.70594: getting the remaining hosts for this loop
15896 1727203908.70596: done getting the remaining hosts for this loop
15896 1727203908.70600: getting the next task for host managed-node1
15896 1727203908.70606: done getting next task for host managed-node1
15896 1727203908.70614: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15896 1727203908.70619: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue?
False, did start at task? False
15896 1727203908.70640: getting variables
15896 1727203908.70641: in VariableManager get_vars()
15896 1727203908.70749: Calling all_inventory to load vars for managed-node1
15896 1727203908.70752: Calling groups_inventory to load vars for managed-node1
15896 1727203908.70754: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203908.70772: Calling all_plugins_play to load vars for managed-node1
15896 1727203908.70776: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203908.70835: Calling groups_plugins_play to load vars for managed-node1
15896 1727203908.72379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203908.73411: done with get_vars()
15896 1727203908.73435: done getting variables
15896 1727203908.73509: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.136) 0:00:54.324 *****
15896 1727203908.73547: entering _queue_task() for managed-node1/package
15896 1727203908.73922: worker is 1 (out of 1 available)
15896 1727203908.73968: exiting _queue_task() for managed-node1/package
15896 1727203908.73983: done queuing things up, now waiting for results queue to drain
15896 1727203908.73985: waiting for pending results...
15896 1727203908.74228: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15896 1727203908.74322: in run() - task 028d2410-947f-fb83-b6ad-000000000175
15896 1727203908.74333: variable 'ansible_search_path' from source: unknown
15896 1727203908.74336: variable 'ansible_search_path' from source: unknown
15896 1727203908.74367: calling self._execute()
15896 1727203908.74462: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203908.74470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203908.74481: variable 'omit' from source: magic vars
15896 1727203908.74764: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.74773: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203908.74854: variable 'network_state' from source: role '' defaults
15896 1727203908.74868: Evaluated conditional (network_state != {}): False
15896 1727203908.74871: when evaluation is False, skipping this task
15896 1727203908.74873: _execute() done
15896 1727203908.74877: dumping result to json
15896 1727203908.74880: done dumping result, returning
15896 1727203908.74889: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000175]
15896 1727203908.74894: sending task result for task 028d2410-947f-fb83-b6ad-000000000175
15896 1727203908.74983: done sending task result for task 028d2410-947f-fb83-b6ad-000000000175
15896 1727203908.74986: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203908.75029: no more pending results, returning what we have
15896 1727203908.75032: results queue empty
15896 1727203908.75033: checking
for any_errors_fatal
15896 1727203908.75041: done checking for any_errors_fatal
15896 1727203908.75042: checking for max_fail_percentage
15896 1727203908.75043: done checking for max_fail_percentage
15896 1727203908.75045: checking to see if all hosts have failed and the running result is not ok
15896 1727203908.75045: done checking to see if all hosts have failed
15896 1727203908.75046: getting the remaining hosts for this loop
15896 1727203908.75047: done getting the remaining hosts for this loop
15896 1727203908.75051: getting the next task for host managed-node1
15896 1727203908.75059: done getting next task for host managed-node1
15896 1727203908.75063: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15896 1727203908.75067: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
15896 1727203908.75098: getting variables
15896 1727203908.75100: in VariableManager get_vars()
15896 1727203908.75146: Calling all_inventory to load vars for managed-node1
15896 1727203908.75148: Calling groups_inventory to load vars for managed-node1
15896 1727203908.75150: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203908.75159: Calling all_plugins_play to load vars for managed-node1
15896 1727203908.75162: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203908.75165: Calling groups_plugins_play to load vars for managed-node1
15896 1727203908.76158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203908.77566: done with get_vars()
15896 1727203908.77587: done getting variables
15896 1727203908.77628: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.041) 0:00:54.365 *****
15896 1727203908.77656: entering _queue_task() for managed-node1/package
15896 1727203908.77901: worker is 1 (out of 1 available)
15896 1727203908.77914: exiting _queue_task() for managed-node1/package
15896 1727203908.77925: done queuing things up, now waiting for results queue to drain
15896 1727203908.77927: waiting for pending results...
15896 1727203908.78112: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15896 1727203908.78214: in run() - task 028d2410-947f-fb83-b6ad-000000000176
15896 1727203908.78225: variable 'ansible_search_path' from source: unknown
15896 1727203908.78228: variable 'ansible_search_path' from source: unknown
15896 1727203908.78258: calling self._execute()
15896 1727203908.78342: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203908.78346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203908.78355: variable 'omit' from source: magic vars
15896 1727203908.78636: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.78645: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203908.78773: variable 'network_state' from source: role '' defaults
15896 1727203908.78778: Evaluated conditional (network_state != {}): False
15896 1727203908.78783: when evaluation is False, skipping this task
15896 1727203908.78785: _execute() done
15896 1727203908.78788: dumping result to json
15896 1727203908.78790: done dumping result, returning
15896 1727203908.78792: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-fb83-b6ad-000000000176]
15896 1727203908.78795: sending task result for task 028d2410-947f-fb83-b6ad-000000000176
15896 1727203908.78925: done sending task result for task 028d2410-947f-fb83-b6ad-000000000176
15896 1727203908.78927: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15896 1727203908.78970: no more pending results, returning what we have
15896 1727203908.78974: results queue empty
15896 1727203908.78975: checking for
any_errors_fatal
15896 1727203908.78985: done checking for any_errors_fatal
15896 1727203908.78986: checking for max_fail_percentage
15896 1727203908.78988: done checking for max_fail_percentage
15896 1727203908.78989: checking to see if all hosts have failed and the running result is not ok
15896 1727203908.78990: done checking to see if all hosts have failed
15896 1727203908.78990: getting the remaining hosts for this loop
15896 1727203908.78992: done getting the remaining hosts for this loop
15896 1727203908.78995: getting the next task for host managed-node1
15896 1727203908.79002: done getting next task for host managed-node1
15896 1727203908.79006: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15896 1727203908.79010: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
15896 1727203908.79033: getting variables
15896 1727203908.79034: in VariableManager get_vars()
15896 1727203908.79080: Calling all_inventory to load vars for managed-node1
15896 1727203908.79107: Calling groups_inventory to load vars for managed-node1
15896 1727203908.79111: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203908.79120: Calling all_plugins_play to load vars for managed-node1
15896 1727203908.79123: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203908.79126: Calling groups_plugins_play to load vars for managed-node1
15896 1727203908.80420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203908.81568: done with get_vars()
15896 1727203908.81590: done getting variables
15896 1727203908.81633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.040) 0:00:54.405 *****
15896 1727203908.81663: entering _queue_task() for managed-node1/service
15896 1727203908.81920: worker is 1 (out of 1 available)
15896 1727203908.81933: exiting _queue_task() for managed-node1/service
15896 1727203908.81945: done queuing things up, now waiting for results queue to drain
15896 1727203908.81946: waiting for pending results...
15896 1727203908.82141: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15896 1727203908.82241: in run() - task 028d2410-947f-fb83-b6ad-000000000177
15896 1727203908.82251: variable 'ansible_search_path' from source: unknown
15896 1727203908.82255: variable 'ansible_search_path' from source: unknown
15896 1727203908.82287: calling self._execute()
15896 1727203908.82366: variable 'ansible_host' from source: host vars for 'managed-node1'
15896 1727203908.82369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
15896 1727203908.82380: variable 'omit' from source: magic vars
15896 1727203908.82713: variable 'ansible_distribution_major_version' from source: facts
15896 1727203908.82717: Evaluated conditional (ansible_distribution_major_version != '6'): True
15896 1727203908.82829: variable '__network_wireless_connections_defined' from source: role '' defaults
15896 1727203908.83040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15896 1727203908.84955: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15896 1727203908.85012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15896 1727203908.85039: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15896 1727203908.85067: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15896 1727203908.85089: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15896 1727203908.85151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True,
class_only=False)
15896 1727203908.85175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203908.85194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.85224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203908.85235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203908.85269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203908.85287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203908.85303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.85332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203908.85343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203908.85373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15896 1727203908.85391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15896 1727203908.85407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.85434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15896 1727203908.85444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15896 1727203908.85558: variable 'network_connections' from source: task vars
15896 1727203908.85570: variable 'controller_profile' from source: play vars
15896 1727203908.85620: variable 'controller_profile' from source: play vars
15896 1727203908.85677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15896 1727203908.85834: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15896 1727203908.85862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15896 1727203908.85914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15896
1727203908.85961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15896 1727203908.86047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15896 1727203908.86050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15896 1727203908.86100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15896 1727203908.86187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15896 1727203908.86297: variable '__network_team_connections_defined' from source: role '' defaults
15896 1727203908.86511: variable 'network_connections' from source: task vars
15896 1727203908.86514: variable 'controller_profile' from source: play vars
15896 1727203908.86577: variable 'controller_profile' from source: play vars
15896 1727203908.86593: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15896 1727203908.86597: when evaluation is False, skipping this task
15896 1727203908.86604: _execute() done
15896 1727203908.86607: dumping result to json
15896 1727203908.86614: done dumping result, returning
15896 1727203908.86616: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-fb83-b6ad-000000000177]
15896 1727203908.86621: sending task result for task 028d2410-947f-fb83-b6ad-000000000177
15896 1727203908.86747: done sending task
result for task 028d2410-947f-fb83-b6ad-000000000177
15896 1727203908.86755: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15896 1727203908.86806: no more pending results, returning what we have
15896 1727203908.86809: results queue empty
15896 1727203908.86810: checking for any_errors_fatal
15896 1727203908.86816: done checking for any_errors_fatal
15896 1727203908.86817: checking for max_fail_percentage
15896 1727203908.86818: done checking for max_fail_percentage
15896 1727203908.86819: checking to see if all hosts have failed and the running result is not ok
15896 1727203908.86820: done checking to see if all hosts have failed
15896 1727203908.86821: getting the remaining hosts for this loop
15896 1727203908.86822: done getting the remaining hosts for this loop
15896 1727203908.86826: getting the next task for host managed-node1
15896 1727203908.86833: done getting next task for host managed-node1
15896 1727203908.86836: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15896 1727203908.86840: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
15896 1727203908.86863: getting variables
15896 1727203908.86865: in VariableManager get_vars()
15896 1727203908.86919: Calling all_inventory to load vars for managed-node1
15896 1727203908.86922: Calling groups_inventory to load vars for managed-node1
15896 1727203908.86924: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203908.86937: Calling all_plugins_play to load vars for managed-node1
15896 1727203908.86940: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203908.86945: Calling groups_plugins_play to load vars for managed-node1
15896 1727203908.88514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203908.89794: done with get_vars()
15896 1727203908.89812: done getting variables
15896 1727203908.89867: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:51:48 -0400 (0:00:00.082) 0:00:54.488 *****
15896 1727203908.89893: entering _queue_task() for managed-node1/service
15896 1727203908.90209: worker is 1 (out of 1 available)
15896 1727203908.90221: exiting _queue_task() for managed-node1/service
15896 1727203908.90232: done queuing things up, now waiting for results queue to drain
15896 1727203908.90233: waiting for pending results...
15896 1727203908.90444: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15896 1727203908.90589: in run() - task 028d2410-947f-fb83-b6ad-000000000178 15896 1727203908.90614: variable 'ansible_search_path' from source: unknown 15896 1727203908.90617: variable 'ansible_search_path' from source: unknown 15896 1727203908.90666: calling self._execute() 15896 1727203908.90779: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203908.90783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203908.90799: variable 'omit' from source: magic vars 15896 1727203908.91184: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.91188: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203908.91340: variable 'network_provider' from source: set_fact 15896 1727203908.91344: variable 'network_state' from source: role '' defaults 15896 1727203908.91347: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15896 1727203908.91349: variable 'omit' from source: magic vars 15896 1727203908.91391: variable 'omit' from source: magic vars 15896 1727203908.91422: variable 'network_service_name' from source: role '' defaults 15896 1727203908.91489: variable 'network_service_name' from source: role '' defaults 15896 1727203908.91644: variable '__network_provider_setup' from source: role '' defaults 15896 1727203908.91782: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203908.91786: variable '__network_service_name_default_nm' from source: role '' defaults 15896 1727203908.91789: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203908.91813: variable '__network_packages_default_nm' from source: role '' defaults 15896 1727203908.92067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15896 1727203908.95663: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203908.95791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203908.95818: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203908.95846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203908.95894: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203908.95939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.95986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.96021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.96048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.96061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.96094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15896 1727203908.96112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.96146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.96171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.96183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.96348: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15896 1727203908.96428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.96445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.96465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.96492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.96502: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.96568: variable 'ansible_python' from source: facts 15896 1727203908.96584: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15896 1727203908.96678: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203908.96712: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15896 1727203908.96816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.96828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.96850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.97080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.97084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.97086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203908.97095: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203908.97098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.97100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203908.97102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203908.97131: variable 'network_connections' from source: task vars 15896 1727203908.97139: variable 'controller_profile' from source: play vars 15896 1727203908.97195: variable 'controller_profile' from source: play vars 15896 1727203908.97268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203908.97463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203908.97520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203908.97572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203908.97616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203908.97665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203908.97687: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203908.97709: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203908.97738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203908.97889: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203908.98043: variable 'network_connections' from source: task vars 15896 1727203908.98049: variable 'controller_profile' from source: play vars 15896 1727203908.98116: variable 'controller_profile' from source: play vars 15896 1727203908.98143: variable '__network_packages_default_wireless' from source: role '' defaults 15896 1727203908.98213: variable '__network_wireless_connections_defined' from source: role '' defaults 15896 1727203908.98520: variable 'network_connections' from source: task vars 15896 1727203908.98532: variable 'controller_profile' from source: play vars 15896 1727203908.98581: variable 'controller_profile' from source: play vars 15896 1727203908.98598: variable '__network_packages_default_team' from source: role '' defaults 15896 1727203908.98652: variable '__network_team_connections_defined' from source: role '' defaults 15896 1727203908.98845: variable 'network_connections' from source: task vars 15896 1727203908.98848: variable 'controller_profile' from source: play vars 15896 1727203908.98908: variable 'controller_profile' from source: play vars 15896 1727203908.98947: variable '__network_service_name_default_initscripts' from source: role '' defaults 15896 1727203908.99015: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 15896 1727203908.99018: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203908.99074: variable '__network_packages_default_initscripts' from source: role '' defaults 15896 1727203908.99246: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15896 1727203908.99743: variable 'network_connections' from source: task vars 15896 1727203908.99747: variable 'controller_profile' from source: play vars 15896 1727203908.99803: variable 'controller_profile' from source: play vars 15896 1727203908.99809: variable 'ansible_distribution' from source: facts 15896 1727203908.99822: variable '__network_rh_distros' from source: role '' defaults 15896 1727203908.99829: variable 'ansible_distribution_major_version' from source: facts 15896 1727203908.99857: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15896 1727203909.00006: variable 'ansible_distribution' from source: facts 15896 1727203909.00009: variable '__network_rh_distros' from source: role '' defaults 15896 1727203909.00011: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.00013: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15896 1727203909.00185: variable 'ansible_distribution' from source: facts 15896 1727203909.00188: variable '__network_rh_distros' from source: role '' defaults 15896 1727203909.00190: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.00239: variable 'network_provider' from source: set_fact 15896 1727203909.00247: variable 'omit' from source: magic vars 15896 1727203909.00277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203909.00299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203909.00319: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203909.00346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203909.00349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203909.00369: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203909.00372: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.00374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.00465: Set connection var ansible_shell_type to sh 15896 1727203909.00470: Set connection var ansible_connection to ssh 15896 1727203909.00477: Set connection var ansible_shell_executable to /bin/sh 15896 1727203909.00482: Set connection var ansible_pipelining to False 15896 1727203909.00487: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203909.00507: Set connection var ansible_timeout to 10 15896 1727203909.00521: variable 'ansible_shell_executable' from source: unknown 15896 1727203909.00524: variable 'ansible_connection' from source: unknown 15896 1727203909.00526: variable 'ansible_module_compression' from source: unknown 15896 1727203909.00534: variable 'ansible_shell_type' from source: unknown 15896 1727203909.00537: variable 'ansible_shell_executable' from source: unknown 15896 1727203909.00539: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.00541: variable 'ansible_pipelining' from source: unknown 15896 1727203909.00544: variable 'ansible_timeout' from source: unknown 15896 1727203909.00546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.00638: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203909.00647: variable 'omit' from source: magic vars 15896 1727203909.00650: starting attempt loop 15896 1727203909.00653: running the handler 15896 1727203909.00732: variable 'ansible_facts' from source: unknown 15896 1727203909.01360: _low_level_execute_command(): starting 15896 1727203909.01399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203909.01913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.01917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.01920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.01922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.01969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203909.01972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.01978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15896 1727203909.02090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.03927: stdout chunk (state=3): >>>/root <<< 15896 1727203909.04008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.04041: stderr chunk (state=3): >>><<< 15896 1727203909.04044: stdout chunk (state=3): >>><<< 15896 1727203909.04070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.04158: _low_level_execute_command(): starting 15896 1727203909.04164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878 `" && echo ansible-tmp-1727203909.0406928-20145-25939512050878="` echo 
/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878 `" ) && sleep 0' 15896 1727203909.05031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203909.05047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203909.05063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.05083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203909.05101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203909.05114: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203909.05130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.05169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203909.05244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203909.05267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.05288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.05404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.07546: stdout chunk (state=3): 
>>>ansible-tmp-1727203909.0406928-20145-25939512050878=/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878 <<< 15896 1727203909.07686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.07693: stdout chunk (state=3): >>><<< 15896 1727203909.07700: stderr chunk (state=3): >>><<< 15896 1727203909.07714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203909.0406928-20145-25939512050878=/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.07742: variable 'ansible_module_compression' from source: unknown 15896 1727203909.07781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15896 1727203909.08084: variable 'ansible_facts' 
from source: unknown 15896 1727203909.08087: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py 15896 1727203909.08210: Sending initial data 15896 1727203909.08310: Sent initial data (155 bytes) 15896 1727203909.08833: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203909.08841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203909.08853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.08867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203909.08883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203909.08890: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203909.08899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.08913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203909.08920: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203909.09023: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.09027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203909.09030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.09032: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15896 1727203909.09202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.11422: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203909.11481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203909.11558: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmposmda2em /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py <<< 15896 1727203909.11568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py" <<< 15896 1727203909.11656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmposmda2em" to remote "/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py" <<< 15896 1727203909.13772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.13779: stderr chunk (state=3): >>><<< 15896 1727203909.14021: stdout chunk (state=3): >>><<< 15896 1727203909.14024: done transferring module to remote 15896 1727203909.14026: _low_level_execute_command(): starting 15896 1727203909.14029: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/ /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py && sleep 0' 15896 1727203909.14695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203909.14709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.14788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.14883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.16879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.16899: stdout chunk (state=3): >>><<< 15896 1727203909.16913: stderr chunk (state=3): >>><<< 15896 1727203909.16931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.16940: _low_level_execute_command(): starting 15896 1727203909.16949: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/AnsiballZ_systemd.py && sleep 0' 15896 1727203909.17527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203909.17541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.17597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.17610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.17701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.48704: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, 
"status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3293327360", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1116608000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 15896 1727203909.48709: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", 
"ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "networ<<< 15896 1727203909.48718: stdout chunk (state=3): >>>k-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": 
"none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15896 1727203909.51030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203909.51034: stderr chunk (state=3): >>><<< 15896 1727203909.51036: stdout chunk (state=3): >>><<< 15896 1727203909.51287: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", 
"ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3293327360", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1116608000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203909.51298: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203909.51301: _low_level_execute_command(): starting 15896 1727203909.51309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203909.0406928-20145-25939512050878/ > /dev/null 2>&1 && sleep 0' 15896 1727203909.52017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203909.52152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.52202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.52310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.54333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.54337: stdout chunk (state=3): >>><<< 15896 1727203909.54344: stderr chunk (state=3): >>><<< 15896 1727203909.54366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.54373: handler run complete 15896 1727203909.54441: attempt loop complete, returning result 15896 1727203909.54445: _execute() done 15896 1727203909.54447: dumping result to json 15896 1727203909.54473: done dumping result, returning 15896 1727203909.54486: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-fb83-b6ad-000000000178] 15896 1727203909.54489: sending task result for task 028d2410-947f-fb83-b6ad-000000000178 15896 1727203909.54781: done sending task result for task 028d2410-947f-fb83-b6ad-000000000178 15896 1727203909.54784: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203909.54842: no more pending results, returning what we have 15896 1727203909.54845: results queue empty 15896 1727203909.54846: checking for any_errors_fatal 15896 1727203909.54852: done checking for any_errors_fatal 15896 1727203909.54852: checking for max_fail_percentage 15896 1727203909.54854: done checking for max_fail_percentage 15896 1727203909.54855: checking to see if all hosts have failed and the running result is not ok 15896 1727203909.54856: done checking to see if all hosts have failed 15896 1727203909.54857: getting the remaining hosts for this loop 15896 1727203909.54858: done getting the remaining hosts for this loop 15896 1727203909.54861: getting the next task for host managed-node1 15896 1727203909.54869: done getting next task for host managed-node1 15896 1727203909.54873: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 15896 1727203909.55090: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203909.55104: getting variables 15896 1727203909.55106: in VariableManager get_vars() 15896 1727203909.55147: Calling all_inventory to load vars for managed-node1 15896 1727203909.55150: Calling groups_inventory to load vars for managed-node1 15896 1727203909.55152: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203909.55160: Calling all_plugins_play to load vars for managed-node1 15896 1727203909.55163: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203909.55165: Calling groups_plugins_play to load vars for managed-node1 15896 1727203909.56780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203909.59542: done with get_vars() 15896 1727203909.59574: done getting variables 15896 1727203909.59745: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:51:49 -0400 (0:00:00.698) 0:00:55.187 ***** 15896 1727203909.59786: entering _queue_task() for managed-node1/service 15896 1727203909.60372: worker is 1 (out of 1 available) 15896 1727203909.60389: exiting _queue_task() for managed-node1/service 15896 1727203909.60399: done queuing things up, now waiting for results queue to drain 15896 1727203909.60401: waiting for pending results... 15896 1727203909.60790: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15896 1727203909.60801: in run() - task 028d2410-947f-fb83-b6ad-000000000179 15896 1727203909.60805: variable 'ansible_search_path' from source: unknown 15896 1727203909.60807: variable 'ansible_search_path' from source: unknown 15896 1727203909.60825: calling self._execute() 15896 1727203909.60940: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.61032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.61127: variable 'omit' from source: magic vars 15896 1727203909.61487: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.61504: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203909.61631: variable 'network_provider' from source: set_fact 15896 1727203909.61689: Evaluated conditional (network_provider == "nm"): True 15896 1727203909.61789: variable '__network_wpa_supplicant_required' from source: role '' defaults 15896 1727203909.61878: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 15896 1727203909.62182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203909.65191: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203909.65278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203909.65327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203909.65368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203909.65411: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203909.65523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203909.65554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203909.65583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203909.65630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203909.65645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203909.65701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203909.65735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203909.65764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203909.65816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203909.65916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203909.65921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203909.65924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203909.65934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203909.65979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 
1727203909.66000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203909.66168: variable 'network_connections' from source: task vars 15896 1727203909.66188: variable 'controller_profile' from source: play vars 15896 1727203909.66351: variable 'controller_profile' from source: play vars 15896 1727203909.66354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15896 1727203909.66522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15896 1727203909.66568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15896 1727203909.66609: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15896 1727203909.66642: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15896 1727203909.66701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15896 1727203909.66725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15896 1727203909.66752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203909.66783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15896 1727203909.66842: variable 
'__network_wireless_connections_defined' from source: role '' defaults 15896 1727203909.67089: variable 'network_connections' from source: task vars 15896 1727203909.67103: variable 'controller_profile' from source: play vars 15896 1727203909.67169: variable 'controller_profile' from source: play vars 15896 1727203909.67206: Evaluated conditional (__network_wpa_supplicant_required): False 15896 1727203909.67219: when evaluation is False, skipping this task 15896 1727203909.67231: _execute() done 15896 1727203909.67325: dumping result to json 15896 1727203909.67329: done dumping result, returning 15896 1727203909.67334: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-fb83-b6ad-000000000179] 15896 1727203909.67346: sending task result for task 028d2410-947f-fb83-b6ad-000000000179 15896 1727203909.67424: done sending task result for task 028d2410-947f-fb83-b6ad-000000000179 15896 1727203909.67429: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15896 1727203909.67484: no more pending results, returning what we have 15896 1727203909.67488: results queue empty 15896 1727203909.67489: checking for any_errors_fatal 15896 1727203909.67512: done checking for any_errors_fatal 15896 1727203909.67513: checking for max_fail_percentage 15896 1727203909.67515: done checking for max_fail_percentage 15896 1727203909.67516: checking to see if all hosts have failed and the running result is not ok 15896 1727203909.67517: done checking to see if all hosts have failed 15896 1727203909.67518: getting the remaining hosts for this loop 15896 1727203909.67520: done getting the remaining hosts for this loop 15896 1727203909.67523: getting the next task for host managed-node1 15896 1727203909.67531: done getting next task for host managed-node1 15896 1727203909.67535: ^ task is: 
TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203909.67539: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203909.67564: getting variables 15896 1727203909.67566: in VariableManager get_vars() 15896 1727203909.67623: Calling all_inventory to load vars for managed-node1 15896 1727203909.67626: Calling groups_inventory to load vars for managed-node1 15896 1727203909.67629: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203909.67640: Calling all_plugins_play to load vars for managed-node1 15896 1727203909.67643: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203909.67646: Calling groups_plugins_play to load vars for managed-node1 15896 1727203909.69097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203909.69967: done with get_vars() 15896 1727203909.69985: done getting variables 15896 1727203909.70026: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:51:49 -0400 (0:00:00.102) 0:00:55.289 ***** 15896 1727203909.70050: entering _queue_task() for managed-node1/service 15896 1727203909.70284: worker is 1 (out of 1 available) 15896 1727203909.70295: exiting _queue_task() for managed-node1/service 15896 1727203909.70308: done queuing things up, now waiting for results queue to drain 15896 1727203909.70309: waiting for pending results... 15896 1727203909.70704: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 15896 1727203909.70708: in run() - task 028d2410-947f-fb83-b6ad-00000000017a 15896 1727203909.70712: variable 'ansible_search_path' from source: unknown 15896 1727203909.70715: variable 'ansible_search_path' from source: unknown 15896 1727203909.70718: calling self._execute() 15896 1727203909.70888: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.70892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.70895: variable 'omit' from source: magic vars 15896 1727203909.71182: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.71193: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203909.71305: variable 'network_provider' from source: set_fact 15896 1727203909.71308: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203909.71320: when evaluation is False, skipping this task 15896 1727203909.71323: _execute() done 15896 1727203909.71326: dumping result to json 15896 1727203909.71328: done dumping result, 
returning 15896 1727203909.71330: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-fb83-b6ad-00000000017a] 15896 1727203909.71333: sending task result for task 028d2410-947f-fb83-b6ad-00000000017a 15896 1727203909.71432: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017a 15896 1727203909.71436: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15896 1727203909.71586: no more pending results, returning what we have 15896 1727203909.71589: results queue empty 15896 1727203909.71589: checking for any_errors_fatal 15896 1727203909.71595: done checking for any_errors_fatal 15896 1727203909.71595: checking for max_fail_percentage 15896 1727203909.71597: done checking for max_fail_percentage 15896 1727203909.71598: checking to see if all hosts have failed and the running result is not ok 15896 1727203909.71598: done checking to see if all hosts have failed 15896 1727203909.71599: getting the remaining hosts for this loop 15896 1727203909.71600: done getting the remaining hosts for this loop 15896 1727203909.71603: getting the next task for host managed-node1 15896 1727203909.71609: done getting next task for host managed-node1 15896 1727203909.71612: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203909.71615: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203909.71632: getting variables 15896 1727203909.71634: in VariableManager get_vars() 15896 1727203909.71685: Calling all_inventory to load vars for managed-node1 15896 1727203909.71688: Calling groups_inventory to load vars for managed-node1 15896 1727203909.71698: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203909.71707: Calling all_plugins_play to load vars for managed-node1 15896 1727203909.71710: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203909.71713: Calling groups_plugins_play to load vars for managed-node1 15896 1727203909.72600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203909.73467: done with get_vars() 15896 1727203909.73486: done getting variables 15896 1727203909.73526: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:51:49 -0400 (0:00:00.034) 0:00:55.324 ***** 15896 1727203909.73549: entering _queue_task() for managed-node1/copy 15896 1727203909.73832: worker is 1 (out of 1 available) 15896 
1727203909.73843: exiting _queue_task() for managed-node1/copy 15896 1727203909.73856: done queuing things up, now waiting for results queue to drain 15896 1727203909.73857: waiting for pending results... 15896 1727203909.74295: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15896 1727203909.74299: in run() - task 028d2410-947f-fb83-b6ad-00000000017b 15896 1727203909.74316: variable 'ansible_search_path' from source: unknown 15896 1727203909.74323: variable 'ansible_search_path' from source: unknown 15896 1727203909.74356: calling self._execute() 15896 1727203909.74459: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.74501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.74515: variable 'omit' from source: magic vars 15896 1727203909.75199: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.75202: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203909.75320: variable 'network_provider' from source: set_fact 15896 1727203909.75330: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203909.75336: when evaluation is False, skipping this task 15896 1727203909.75341: _execute() done 15896 1727203909.75347: dumping result to json 15896 1727203909.75483: done dumping result, returning 15896 1727203909.75486: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-fb83-b6ad-00000000017b] 15896 1727203909.75488: sending task result for task 028d2410-947f-fb83-b6ad-00000000017b 15896 1727203909.75555: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017b 15896 1727203909.75558: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203909.75629: no more pending results, returning what we have 15896 1727203909.75633: results queue empty 15896 1727203909.75634: checking for any_errors_fatal 15896 1727203909.75641: done checking for any_errors_fatal 15896 1727203909.75641: checking for max_fail_percentage 15896 1727203909.75643: done checking for max_fail_percentage 15896 1727203909.75644: checking to see if all hosts have failed and the running result is not ok 15896 1727203909.75645: done checking to see if all hosts have failed 15896 1727203909.75645: getting the remaining hosts for this loop 15896 1727203909.75647: done getting the remaining hosts for this loop 15896 1727203909.75650: getting the next task for host managed-node1 15896 1727203909.75657: done getting next task for host managed-node1 15896 1727203909.75661: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203909.75665: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203909.75690: getting variables 15896 1727203909.75692: in VariableManager get_vars() 15896 1727203909.75742: Calling all_inventory to load vars for managed-node1 15896 1727203909.75745: Calling groups_inventory to load vars for managed-node1 15896 1727203909.75747: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203909.75758: Calling all_plugins_play to load vars for managed-node1 15896 1727203909.75762: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203909.75764: Calling groups_plugins_play to load vars for managed-node1 15896 1727203909.77093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203909.78143: done with get_vars() 15896 1727203909.78171: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:51:49 -0400 (0:00:00.047) 0:00:55.371 ***** 15896 1727203909.78255: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203909.78571: worker is 1 (out of 1 available) 15896 1727203909.78585: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 15896 1727203909.78598: done queuing things up, now waiting for results queue to drain 15896 1727203909.78599: waiting for pending results... 
15896 1727203909.78908: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15896 1727203909.79110: in run() - task 028d2410-947f-fb83-b6ad-00000000017c 15896 1727203909.79114: variable 'ansible_search_path' from source: unknown 15896 1727203909.79116: variable 'ansible_search_path' from source: unknown 15896 1727203909.79120: calling self._execute() 15896 1727203909.79382: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.79387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.79390: variable 'omit' from source: magic vars 15896 1727203909.79559: variable 'ansible_distribution_major_version' from source: facts 15896 1727203909.79572: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203909.79580: variable 'omit' from source: magic vars 15896 1727203909.79644: variable 'omit' from source: magic vars 15896 1727203909.79802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15896 1727203909.81312: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15896 1727203909.81360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15896 1727203909.81388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15896 1727203909.81416: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15896 1727203909.81437: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15896 1727203909.81501: variable 'network_provider' from source: set_fact 15896 1727203909.81593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15896 1727203909.81623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15896 1727203909.81641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15896 1727203909.81669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15896 1727203909.81685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15896 1727203909.81795: variable 'omit' from source: magic vars 15896 1727203909.81855: variable 'omit' from source: magic vars 15896 1727203909.81953: variable 'network_connections' from source: task vars 15896 1727203909.81965: variable 'controller_profile' from source: play vars 15896 1727203909.82199: variable 'controller_profile' from source: play vars 15896 1727203909.82202: variable 'omit' from source: magic vars 15896 1727203909.82205: variable '__lsr_ansible_managed' from source: task vars 15896 1727203909.82236: variable '__lsr_ansible_managed' from source: task vars 15896 1727203909.82400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15896 1727203909.82610: Loaded config def from plugin (lookup/template) 15896 1727203909.82613: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15896 1727203909.82640: File lookup term: get_ansible_managed.j2 15896 1727203909.82644: 
variable 'ansible_search_path' from source: unknown 15896 1727203909.82647: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15896 1727203909.82663: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15896 1727203909.82683: variable 'ansible_search_path' from source: unknown 15896 1727203909.87026: variable 'ansible_managed' from source: unknown 15896 1727203909.87180: variable 'omit' from source: magic vars 15896 1727203909.87228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203909.87232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203909.87282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203909.87285: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203909.87287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203909.87343: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203909.87347: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.87350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.87449: Set connection var ansible_shell_type to sh 15896 1727203909.87453: Set connection var ansible_connection to ssh 15896 1727203909.87455: Set connection var ansible_shell_executable to /bin/sh 15896 1727203909.87458: Set connection var ansible_pipelining to False 15896 1727203909.87466: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203909.87468: Set connection var ansible_timeout to 10 15896 1727203909.87544: variable 'ansible_shell_executable' from source: unknown 15896 1727203909.87547: variable 'ansible_connection' from source: unknown 15896 1727203909.87550: variable 'ansible_module_compression' from source: unknown 15896 1727203909.87552: variable 'ansible_shell_type' from source: unknown 15896 1727203909.87554: variable 'ansible_shell_executable' from source: unknown 15896 1727203909.87556: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203909.87558: variable 'ansible_pipelining' from source: unknown 15896 1727203909.87562: variable 'ansible_timeout' from source: unknown 15896 1727203909.87564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203909.87710: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203909.87807: variable 'omit' from source: magic vars 15896 1727203909.87810: starting attempt loop 15896 1727203909.87813: running the handler 15896 1727203909.87815: _low_level_execute_command(): starting 15896 1727203909.87818: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203909.88336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203909.88394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.88455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203909.88464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.88507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.88601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.90375: stdout chunk (state=3): >>>/root <<< 15896 1727203909.90485: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.90519: stderr chunk (state=3): >>><<< 15896 1727203909.90522: stdout chunk (state=3): >>><<< 15896 1727203909.90541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.90553: _low_level_execute_command(): starting 15896 1727203909.90560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097 `" && echo ansible-tmp-1727203909.9054267-20322-181270544038097="` echo /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097 `" ) && sleep 0' 15896 1727203909.91025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.91029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.91031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.91033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.91093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.91172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.93283: stdout chunk (state=3): >>>ansible-tmp-1727203909.9054267-20322-181270544038097=/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097 <<< 15896 1727203909.93621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.93625: stdout chunk (state=3): >>><<< 15896 1727203909.93628: stderr chunk (state=3): >>><<< 15896 1727203909.93631: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203909.9054267-20322-181270544038097=/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203909.93634: variable 'ansible_module_compression' from source: unknown 15896 1727203909.93636: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15896 1727203909.93749: variable 'ansible_facts' from source: unknown 15896 1727203909.94051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py 15896 1727203909.94105: Sending initial data 15896 1727203909.94108: Sent initial data (168 bytes) 15896 1727203909.95023: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203909.95043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203909.95272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203909.95282: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.95285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203909.95287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203909.95335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.95406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203909.97164: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203909.97240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203909.97316: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp9urn6ap2 /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py <<< 15896 1727203909.97320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py" <<< 15896 1727203909.97406: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp9urn6ap2" to remote "/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py" <<< 15896 1727203909.98935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203909.98954: stderr chunk (state=3): >>><<< 15896 1727203909.98968: stdout chunk (state=3): >>><<< 15896 1727203909.99027: done transferring module to remote 15896 1727203909.99036: _low_level_execute_command(): starting 15896 1727203909.99041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/ /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py && sleep 0' 15896 1727203909.99632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203909.99636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.99642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203909.99648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203909.99707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203909.99785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.01807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.01877: stderr chunk (state=3): >>><<< 15896 1727203910.01882: stdout chunk (state=3): >>><<< 15896 1727203910.01921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203910.01925: _low_level_execute_command(): starting 15896 1727203910.01972: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/AnsiballZ_network_connections.py && sleep 0' 15896 1727203910.02583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203910.02589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203910.02618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.02622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203910.02624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203910.02626: stderr chunk (state=3): 
>>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.02679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.02682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.02779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.45722: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_5d9ii29y/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_5d9ii29y/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/8404d61d-4c80-4763-affe-7d26fa7e8dd3: error=unknown <<< 15896 1727203910.46207: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15896 
1727203910.49070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203910.49074: stdout chunk (state=3): >>><<< 15896 1727203910.49078: stderr chunk (state=3): >>><<< 15896 1727203910.49083: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_5d9ii29y/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_5d9ii29y/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/8404d61d-4c80-4763-affe-7d26fa7e8dd3: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203910.49085: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203910.49087: _low_level_execute_command(): starting 15896 1727203910.49089: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203909.9054267-20322-181270544038097/ > /dev/null 2>&1 && sleep 0' 15896 1727203910.49698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203910.49793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.49816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.49833: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203910.49855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.49967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.52045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.52089: stdout chunk (state=3): >>><<< 15896 1727203910.52126: stderr chunk (state=3): >>><<< 15896 1727203910.52169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203910.52183: handler run complete 15896 1727203910.52213: attempt loop complete, returning result 15896 1727203910.52219: _execute() done 15896 1727203910.52225: dumping result to json 15896 1727203910.52233: done dumping result, returning 15896 1727203910.52246: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-fb83-b6ad-00000000017c] 15896 1727203910.52260: sending task result for task 028d2410-947f-fb83-b6ad-00000000017c changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15896 1727203910.52489: no more pending results, 
returning what we have 15896 1727203910.52493: results queue empty 15896 1727203910.52494: checking for any_errors_fatal 15896 1727203910.52501: done checking for any_errors_fatal 15896 1727203910.52501: checking for max_fail_percentage 15896 1727203910.52503: done checking for max_fail_percentage 15896 1727203910.52504: checking to see if all hosts have failed and the running result is not ok 15896 1727203910.52505: done checking to see if all hosts have failed 15896 1727203910.52505: getting the remaining hosts for this loop 15896 1727203910.52507: done getting the remaining hosts for this loop 15896 1727203910.52510: getting the next task for host managed-node1 15896 1727203910.52517: done getting next task for host managed-node1 15896 1727203910.52520: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203910.52524: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203910.52535: getting variables 15896 1727203910.52537: in VariableManager get_vars() 15896 1727203910.52803: Calling all_inventory to load vars for managed-node1 15896 1727203910.52806: Calling groups_inventory to load vars for managed-node1 15896 1727203910.52808: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203910.52815: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017c 15896 1727203910.52817: WORKER PROCESS EXITING 15896 1727203910.52826: Calling all_plugins_play to load vars for managed-node1 15896 1727203910.52830: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203910.52833: Calling groups_plugins_play to load vars for managed-node1 15896 1727203910.54284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203910.56040: done with get_vars() 15896 1727203910.56082: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:51:50 -0400 (0:00:00.779) 0:00:56.151 ***** 15896 1727203910.56198: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203910.56718: worker is 1 (out of 1 available) 15896 1727203910.56736: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 15896 1727203910.56748: done queuing things up, now waiting for results queue to drain 15896 1727203910.56749: waiting for pending results... 
15896 1727203910.57025: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 15896 1727203910.57219: in run() - task 028d2410-947f-fb83-b6ad-00000000017d 15896 1727203910.57241: variable 'ansible_search_path' from source: unknown 15896 1727203910.57249: variable 'ansible_search_path' from source: unknown 15896 1727203910.57298: calling self._execute() 15896 1727203910.57417: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.57429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.57444: variable 'omit' from source: magic vars 15896 1727203910.57878: variable 'ansible_distribution_major_version' from source: facts 15896 1727203910.57897: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203910.58051: variable 'network_state' from source: role '' defaults 15896 1727203910.58067: Evaluated conditional (network_state != {}): False 15896 1727203910.58075: when evaluation is False, skipping this task 15896 1727203910.58083: _execute() done 15896 1727203910.58096: dumping result to json 15896 1727203910.58105: done dumping result, returning 15896 1727203910.58118: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-fb83-b6ad-00000000017d] 15896 1727203910.58131: sending task result for task 028d2410-947f-fb83-b6ad-00000000017d skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15896 1727203910.58442: no more pending results, returning what we have 15896 1727203910.58448: results queue empty 15896 1727203910.58449: checking for any_errors_fatal 15896 1727203910.58458: done checking for any_errors_fatal 15896 1727203910.58459: checking for max_fail_percentage 15896 1727203910.58461: done checking for max_fail_percentage 15896 1727203910.58462: 
checking to see if all hosts have failed and the running result is not ok 15896 1727203910.58463: done checking to see if all hosts have failed 15896 1727203910.58464: getting the remaining hosts for this loop 15896 1727203910.58466: done getting the remaining hosts for this loop 15896 1727203910.58470: getting the next task for host managed-node1 15896 1727203910.58479: done getting next task for host managed-node1 15896 1727203910.58484: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203910.58492: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203910.58519: getting variables 15896 1727203910.58521: in VariableManager get_vars() 15896 1727203910.58574: Calling all_inventory to load vars for managed-node1 15896 1727203910.58694: Calling groups_inventory to load vars for managed-node1 15896 1727203910.58697: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203910.58708: Calling all_plugins_play to load vars for managed-node1 15896 1727203910.58711: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203910.58714: Calling groups_plugins_play to load vars for managed-node1 15896 1727203910.59391: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017d 15896 1727203910.59394: WORKER PROCESS EXITING 15896 1727203910.60442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203910.62077: done with get_vars() 15896 1727203910.62107: done getting variables 15896 1727203910.62194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:51:50 -0400 (0:00:00.060) 0:00:56.211 ***** 15896 1727203910.62236: entering _queue_task() for managed-node1/debug 15896 1727203910.62718: worker is 1 (out of 1 available) 15896 1727203910.62841: exiting _queue_task() for managed-node1/debug 15896 1727203910.62853: done queuing things up, now waiting for results queue to drain 15896 1727203910.62854: waiting for pending results... 
15896 1727203910.63032: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15896 1727203910.63230: in run() - task 028d2410-947f-fb83-b6ad-00000000017e 15896 1727203910.63265: variable 'ansible_search_path' from source: unknown 15896 1727203910.63274: variable 'ansible_search_path' from source: unknown 15896 1727203910.63328: calling self._execute() 15896 1727203910.63446: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.63463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.63492: variable 'omit' from source: magic vars 15896 1727203910.63927: variable 'ansible_distribution_major_version' from source: facts 15896 1727203910.63945: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203910.63966: variable 'omit' from source: magic vars 15896 1727203910.64040: variable 'omit' from source: magic vars 15896 1727203910.64098: variable 'omit' from source: magic vars 15896 1727203910.64155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203910.64212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203910.64247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203910.64274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.64298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.64339: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203910.64393: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.64397: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 15896 1727203910.64499: Set connection var ansible_shell_type to sh 15896 1727203910.64514: Set connection var ansible_connection to ssh 15896 1727203910.64525: Set connection var ansible_shell_executable to /bin/sh 15896 1727203910.64534: Set connection var ansible_pipelining to False 15896 1727203910.64544: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203910.64554: Set connection var ansible_timeout to 10 15896 1727203910.64611: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.64614: variable 'ansible_connection' from source: unknown 15896 1727203910.64617: variable 'ansible_module_compression' from source: unknown 15896 1727203910.64619: variable 'ansible_shell_type' from source: unknown 15896 1727203910.64623: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.64632: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.64639: variable 'ansible_pipelining' from source: unknown 15896 1727203910.64646: variable 'ansible_timeout' from source: unknown 15896 1727203910.64654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.64808: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203910.64880: variable 'omit' from source: magic vars 15896 1727203910.64883: starting attempt loop 15896 1727203910.64886: running the handler 15896 1727203910.64992: variable '__network_connections_result' from source: set_fact 15896 1727203910.65057: handler run complete 15896 1727203910.65086: attempt loop complete, returning result 15896 1727203910.65094: _execute() done 15896 1727203910.65103: dumping result to json 15896 1727203910.65111: 
done dumping result, returning 15896 1727203910.65126: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-fb83-b6ad-00000000017e] 15896 1727203910.65157: sending task result for task 028d2410-947f-fb83-b6ad-00000000017e
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
15896 1727203910.65330: no more pending results, returning what we have 15896 1727203910.65335: results queue empty 15896 1727203910.65336: checking for any_errors_fatal 15896 1727203910.65342: done checking for any_errors_fatal 15896 1727203910.65343: checking for max_fail_percentage 15896 1727203910.65344: done checking for max_fail_percentage 15896 1727203910.65345: checking to see if all hosts have failed and the running result is not ok 15896 1727203910.65346: done checking to see if all hosts have failed 15896 1727203910.65347: getting the remaining hosts for this loop 15896 1727203910.65348: done getting the remaining hosts for this loop 15896 1727203910.65351: getting the next task for host managed-node1 15896 1727203910.65361: done getting next task for host managed-node1 15896 1727203910.65368: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203910.65372: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203910.65388: getting variables 15896 1727203910.65390: in VariableManager get_vars() 15896 1727203910.65443: Calling all_inventory to load vars for managed-node1 15896 1727203910.65446: Calling groups_inventory to load vars for managed-node1 15896 1727203910.65449: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203910.65462: Calling all_plugins_play to load vars for managed-node1 15896 1727203910.65465: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203910.65469: Calling groups_plugins_play to load vars for managed-node1 15896 1727203910.65693: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017e 15896 1727203910.65697: WORKER PROCESS EXITING 15896 1727203910.67330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203910.69025: done with get_vars() 15896 1727203910.69067: done getting variables 15896 1727203910.69139: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 14:51:50 -0400 (0:00:00.069) 0:00:56.281 *****
15896 1727203910.69189: entering _queue_task() for managed-node1/debug 15896 1727203910.69651: worker is 1 (out of 1 available) 15896 1727203910.69663: exiting _queue_task() for managed-node1/debug 15896
1727203910.69674: done queuing things up, now waiting for results queue to drain 15896 1727203910.69814: waiting for pending results... 15896 1727203910.69974: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15896 1727203910.70140: in run() - task 028d2410-947f-fb83-b6ad-00000000017f 15896 1727203910.70172: variable 'ansible_search_path' from source: unknown 15896 1727203910.70182: variable 'ansible_search_path' from source: unknown 15896 1727203910.70220: calling self._execute() 15896 1727203910.70358: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.70375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.70379: variable 'omit' from source: magic vars 15896 1727203910.70880: variable 'ansible_distribution_major_version' from source: facts 15896 1727203910.70883: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203910.70886: variable 'omit' from source: magic vars 15896 1727203910.70927: variable 'omit' from source: magic vars 15896 1727203910.70969: variable 'omit' from source: magic vars 15896 1727203910.71030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203910.71074: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203910.71103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203910.71125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.71151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.71247: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 
1727203910.71251: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.71253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.71318: Set connection var ansible_shell_type to sh 15896 1727203910.71331: Set connection var ansible_connection to ssh 15896 1727203910.71341: Set connection var ansible_shell_executable to /bin/sh 15896 1727203910.71358: Set connection var ansible_pipelining to False 15896 1727203910.71373: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203910.71389: Set connection var ansible_timeout to 10 15896 1727203910.71417: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.71468: variable 'ansible_connection' from source: unknown 15896 1727203910.71471: variable 'ansible_module_compression' from source: unknown 15896 1727203910.71473: variable 'ansible_shell_type' from source: unknown 15896 1727203910.71477: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.71480: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.71481: variable 'ansible_pipelining' from source: unknown 15896 1727203910.71483: variable 'ansible_timeout' from source: unknown 15896 1727203910.71485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.71619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203910.71635: variable 'omit' from source: magic vars 15896 1727203910.71685: starting attempt loop 15896 1727203910.71688: running the handler 15896 1727203910.71710: variable '__network_connections_result' from source: set_fact 15896 1727203910.71801: variable '__network_connections_result' from 
source: set_fact 15896 1727203910.71918: handler run complete 15896 1727203910.71950: attempt loop complete, returning result 15896 1727203910.71956: _execute() done 15896 1727203910.71966: dumping result to json 15896 1727203910.71980: done dumping result, returning 15896 1727203910.72011: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-fb83-b6ad-00000000017f] 15896 1727203910.72014: sending task result for task 028d2410-947f-fb83-b6ad-00000000017f
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15896 1727203910.72216: no more pending results, returning what we have 15896 1727203910.72221: results queue empty 15896 1727203910.72222: checking for any_errors_fatal 15896 1727203910.72229: done checking for any_errors_fatal 15896 1727203910.72230: checking for max_fail_percentage 15896 1727203910.72231: done checking for max_fail_percentage 15896 1727203910.72233: checking to see if all hosts have failed and the running result is not ok 15896 1727203910.72233: done checking to see if all hosts have failed 15896 1727203910.72234: getting the remaining hosts for this loop 15896 1727203910.72235: done getting the remaining hosts for this loop 15896 1727203910.72238: getting the next task for host managed-node1 15896 1727203910.72246: done getting next task for host managed-node1 15896 1727203910.72250: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203910.72254: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3,
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203910.72269: getting variables 15896 1727203910.72271: in VariableManager get_vars() 15896 1727203910.72324: Calling all_inventory to load vars for managed-node1 15896 1727203910.72327: Calling groups_inventory to load vars for managed-node1 15896 1727203910.72332: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203910.72342: Calling all_plugins_play to load vars for managed-node1 15896 1727203910.72345: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203910.72348: Calling groups_plugins_play to load vars for managed-node1 15896 1727203910.72889: done sending task result for task 028d2410-947f-fb83-b6ad-00000000017f 15896 1727203910.72893: WORKER PROCESS EXITING 15896 1727203910.73307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203910.74386: done with get_vars() 15896 1727203910.74417: done getting variables 15896 1727203910.74483: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 14:51:50 -0400 (0:00:00.053) 0:00:56.334 *****
15896 1727203910.74516: entering _queue_task() for managed-node1/debug 15896 1727203910.74873: worker is 1 (out of 1 available) 15896 1727203910.74890: exiting _queue_task() for managed-node1/debug 15896 1727203910.74903: done queuing things up, now waiting for results queue to drain 15896 1727203910.74907: waiting for pending results... 15896 1727203910.75207: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15896 1727203910.75314: in run() - task 028d2410-947f-fb83-b6ad-000000000180 15896 1727203910.75328: variable 'ansible_search_path' from source: unknown 15896 1727203910.75332: variable 'ansible_search_path' from source: unknown 15896 1727203910.75378: calling self._execute() 15896 1727203910.75464: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.75469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.75486: variable 'omit' from source: magic vars 15896 1727203910.75819: variable 'ansible_distribution_major_version' from source: facts 15896 1727203910.75922: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203910.75942: variable 'network_state' from source: role '' defaults 15896 1727203910.75948: Evaluated conditional (network_state != {}): False 15896 1727203910.75951: when evaluation is False, skipping this task 15896 1727203910.75954: _execute() done 15896 1727203910.75956: dumping result to json 15896 1727203910.75963: done
dumping result, returning 15896 1727203910.75970: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-fb83-b6ad-000000000180] 15896 1727203910.75981: sending task result for task 028d2410-947f-fb83-b6ad-000000000180
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
15896 1727203910.76150: no more pending results, returning what we have 15896 1727203910.76154: results queue empty 15896 1727203910.76155: checking for any_errors_fatal 15896 1727203910.76166: done checking for any_errors_fatal 15896 1727203910.76167: checking for max_fail_percentage 15896 1727203910.76170: done checking for max_fail_percentage 15896 1727203910.76171: checking to see if all hosts have failed and the running result is not ok 15896 1727203910.76172: done checking to see if all hosts have failed 15896 1727203910.76172: getting the remaining hosts for this loop 15896 1727203910.76174: done getting the remaining hosts for this loop 15896 1727203910.76180: getting the next task for host managed-node1 15896 1727203910.76190: done getting next task for host managed-node1 15896 1727203910.76194: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203910.76199: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203910.76221: getting variables 15896 1727203910.76223: in VariableManager get_vars() 15896 1727203910.76288: Calling all_inventory to load vars for managed-node1 15896 1727203910.76291: Calling groups_inventory to load vars for managed-node1 15896 1727203910.76293: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203910.76299: done sending task result for task 028d2410-947f-fb83-b6ad-000000000180 15896 1727203910.76301: WORKER PROCESS EXITING 15896 1727203910.76311: Calling all_plugins_play to load vars for managed-node1 15896 1727203910.76314: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203910.76316: Calling groups_plugins_play to load vars for managed-node1 15896 1727203910.77493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203910.78360: done with get_vars() 15896 1727203910.78379: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 14:51:50 -0400 (0:00:00.039) 0:00:56.373 *****
15896 1727203910.78449: entering _queue_task() for managed-node1/ping 15896 1727203910.78698: worker is 1 (out of 1 available) 15896 1727203910.78710: exiting _queue_task() for managed-node1/ping 15896 1727203910.78722: done queuing things up, now waiting for results queue to drain 15896 1727203910.78723: waiting for pending results... 
15896 1727203910.78915: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15896 1727203910.79289: in run() - task 028d2410-947f-fb83-b6ad-000000000181 15896 1727203910.79293: variable 'ansible_search_path' from source: unknown 15896 1727203910.79296: variable 'ansible_search_path' from source: unknown 15896 1727203910.79298: calling self._execute() 15896 1727203910.79301: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.79303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.79305: variable 'omit' from source: magic vars 15896 1727203910.79657: variable 'ansible_distribution_major_version' from source: facts 15896 1727203910.79675: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203910.79689: variable 'omit' from source: magic vars 15896 1727203910.79764: variable 'omit' from source: magic vars 15896 1727203910.79807: variable 'omit' from source: magic vars 15896 1727203910.79861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203910.79903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203910.79932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203910.79960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.79981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203910.80039: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203910.80049: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.80052: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 15896 1727203910.80146: Set connection var ansible_shell_type to sh 15896 1727203910.80156: Set connection var ansible_connection to ssh 15896 1727203910.80165: Set connection var ansible_shell_executable to /bin/sh 15896 1727203910.80170: Set connection var ansible_pipelining to False 15896 1727203910.80176: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203910.80183: Set connection var ansible_timeout to 10 15896 1727203910.80199: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.80202: variable 'ansible_connection' from source: unknown 15896 1727203910.80205: variable 'ansible_module_compression' from source: unknown 15896 1727203910.80207: variable 'ansible_shell_type' from source: unknown 15896 1727203910.80209: variable 'ansible_shell_executable' from source: unknown 15896 1727203910.80211: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203910.80216: variable 'ansible_pipelining' from source: unknown 15896 1727203910.80218: variable 'ansible_timeout' from source: unknown 15896 1727203910.80222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203910.80383: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15896 1727203910.80392: variable 'omit' from source: magic vars 15896 1727203910.80398: starting attempt loop 15896 1727203910.80400: running the handler 15896 1727203910.80412: _low_level_execute_command(): starting 15896 1727203910.80419: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203910.80929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 
1727203910.80933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.80937: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203910.80940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.80982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.80985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203910.80995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.81082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.82866: stdout chunk (state=3): >>>/root <<< 15896 1727203910.82968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.82997: stderr chunk (state=3): >>><<< 15896 1727203910.83000: stdout chunk (state=3): >>><<< 15896 1727203910.83021: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203910.83033: _low_level_execute_command(): starting 15896 1727203910.83039: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847 `" && echo ansible-tmp-1727203910.830201-20414-163038978227847="` echo /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847 `" ) && sleep 0' 15896 1727203910.83463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203910.83480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203910.83484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.83514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.83562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.83568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203910.83569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.83648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.85721: stdout chunk (state=3): >>>ansible-tmp-1727203910.830201-20414-163038978227847=/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847 <<< 15896 1727203910.85828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.85857: stderr chunk (state=3): >>><<< 15896 1727203910.85863: stdout chunk (state=3): >>><<< 15896 1727203910.85885: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203910.830201-20414-163038978227847=/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203910.85923: variable 'ansible_module_compression' from source: unknown 15896 1727203910.85956: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15896 1727203910.85991: variable 'ansible_facts' from source: unknown 15896 1727203910.86043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py 15896 1727203910.86147: Sending initial data 15896 1727203910.86151: Sent initial data (152 bytes) 15896 1727203910.86578: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203910.86608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203910.86611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203910.86613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.86616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203910.86618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.86681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.86684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203910.86686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.86757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.88488: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203910.88562: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203910.88640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmplnq6vikx /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py <<< 15896 1727203910.88643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py" <<< 15896 1727203910.88713: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmplnq6vikx" to remote "/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py" <<< 15896 1727203910.88718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py" <<< 15896 1727203910.89380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.89423: stderr chunk (state=3): >>><<< 15896 1727203910.89426: stdout chunk (state=3): >>><<< 15896 1727203910.89473: done transferring module to remote 15896 1727203910.89484: _low_level_execute_command(): starting 15896 1727203910.89490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/ /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py && sleep 0' 15896 1727203910.89938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203910.89942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203910.89944: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.89946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203910.89952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.90005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.90013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.90088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203910.92046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203910.92073: stderr chunk (state=3): >>><<< 15896 1727203910.92078: stdout chunk (state=3): >>><<< 15896 1727203910.92092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203910.92094: _low_level_execute_command(): starting 15896 1727203910.92100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/AnsiballZ_ping.py && sleep 0' 15896 1727203910.92542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203910.92545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203910.92548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.92550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203910.92552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203910.92605: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203910.92612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203910.92696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.09008: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15896 1727203911.10523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203911.10554: stderr chunk (state=3): >>><<< 15896 1727203911.10557: stdout chunk (state=3): >>><<< 15896 1727203911.10579: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203911.10605: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203911.10610: _low_level_execute_command(): starting 15896 1727203911.10615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203910.830201-20414-163038978227847/ > /dev/null 2>&1 && sleep 0' 15896 1727203911.11068: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.11072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.11077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.11089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.11147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.11150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.11157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.11233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.13172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.13200: stderr chunk (state=3): >>><<< 15896 1727203911.13203: stdout chunk (state=3): >>><<< 15896 1727203911.13221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15896 1727203911.13224: handler run complete 15896 1727203911.13239: attempt loop complete, returning result 15896 1727203911.13242: _execute() done 15896 1727203911.13244: dumping result to json 15896 1727203911.13246: done dumping result, returning 15896 1727203911.13254: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-fb83-b6ad-000000000181] 15896 1727203911.13257: sending task result for task 028d2410-947f-fb83-b6ad-000000000181 15896 1727203911.13347: done sending task result for task 028d2410-947f-fb83-b6ad-000000000181 15896 1727203911.13350: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 15896 1727203911.13419: no more pending results, returning what we have 15896 1727203911.13423: results queue empty 15896 1727203911.13423: checking for any_errors_fatal 15896 1727203911.13429: done checking for any_errors_fatal 15896 1727203911.13430: checking for max_fail_percentage 15896 1727203911.13432: done checking for max_fail_percentage 15896 1727203911.13432: checking to see if all hosts have failed and the running result is not ok 15896 1727203911.13433: done checking to see if all hosts have failed 15896 1727203911.13434: getting the remaining hosts for this loop 15896 1727203911.13435: done getting the remaining hosts for this loop 15896 1727203911.13438: getting the next task for host managed-node1 15896 1727203911.13448: done getting next task for host managed-node1 15896 1727203911.13450: ^ task is: TASK: meta (role_complete) 15896 1727203911.13455: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203911.13470: getting variables 15896 1727203911.13472: in VariableManager get_vars() 15896 1727203911.13523: Calling all_inventory to load vars for managed-node1 15896 1727203911.13526: Calling groups_inventory to load vars for managed-node1 15896 1727203911.13528: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203911.13538: Calling all_plugins_play to load vars for managed-node1 15896 1727203911.13541: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203911.13543: Calling groups_plugins_play to load vars for managed-node1 15896 1727203911.14437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203911.15301: done with get_vars() 15896 1727203911.15321: done getting variables 15896 1727203911.15381: done queuing things up, now waiting for results queue to drain 15896 1727203911.15382: results queue empty 15896 1727203911.15383: checking for any_errors_fatal 15896 1727203911.15385: done checking for any_errors_fatal 15896 1727203911.15385: checking for max_fail_percentage 15896 1727203911.15386: done checking for max_fail_percentage 15896 1727203911.15386: checking to see if all hosts have failed and the running result is not ok 15896 1727203911.15387: done checking to see if all hosts have failed 15896 1727203911.15387: getting the 
remaining hosts for this loop 15896 1727203911.15388: done getting the remaining hosts for this loop 15896 1727203911.15390: getting the next task for host managed-node1 15896 1727203911.15392: done getting next task for host managed-node1 15896 1727203911.15394: ^ task is: TASK: Delete the device '{{ controller_device }}' 15896 1727203911.15395: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203911.15397: getting variables 15896 1727203911.15398: in VariableManager get_vars() 15896 1727203911.15412: Calling all_inventory to load vars for managed-node1 15896 1727203911.15413: Calling groups_inventory to load vars for managed-node1 15896 1727203911.15414: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203911.15419: Calling all_plugins_play to load vars for managed-node1 15896 1727203911.15421: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203911.15423: Calling groups_plugins_play to load vars for managed-node1 15896 1727203911.16050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203911.20555: done with get_vars() 15896 1727203911.20574: done getting variables 15896 1727203911.20608: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15896 1727203911.20683: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.422) 0:00:56.796 ***** 15896 1727203911.20702: entering _queue_task() for managed-node1/command 15896 1727203911.21012: worker is 1 (out of 1 available) 15896 1727203911.21025: exiting _queue_task() for managed-node1/command 15896 1727203911.21040: done queuing things up, now waiting for results queue to drain 15896 1727203911.21042: waiting for pending results... 15896 1727203911.21397: running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' 15896 1727203911.21584: in run() - task 028d2410-947f-fb83-b6ad-0000000001b1 15896 1727203911.21589: variable 'ansible_search_path' from source: unknown 15896 1727203911.21592: calling self._execute() 15896 1727203911.21680: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.21697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203911.21716: variable 'omit' from source: magic vars 15896 1727203911.22131: variable 'ansible_distribution_major_version' from source: facts 15896 1727203911.22152: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203911.22169: variable 'omit' from source: magic vars 15896 1727203911.22198: variable 'omit' from source: magic vars 15896 1727203911.22307: variable 'controller_device' from source: play vars 15896 1727203911.22382: variable 'omit' from source: magic vars 15896 1727203911.22386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 
1727203911.22424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203911.22450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203911.22477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203911.22502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203911.22541: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203911.22582: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.22586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203911.22682: Set connection var ansible_shell_type to sh 15896 1727203911.22698: Set connection var ansible_connection to ssh 15896 1727203911.22711: Set connection var ansible_shell_executable to /bin/sh 15896 1727203911.22782: Set connection var ansible_pipelining to False 15896 1727203911.22786: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203911.22788: Set connection var ansible_timeout to 10 15896 1727203911.22791: variable 'ansible_shell_executable' from source: unknown 15896 1727203911.22793: variable 'ansible_connection' from source: unknown 15896 1727203911.22796: variable 'ansible_module_compression' from source: unknown 15896 1727203911.22799: variable 'ansible_shell_type' from source: unknown 15896 1727203911.22801: variable 'ansible_shell_executable' from source: unknown 15896 1727203911.22803: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.22805: variable 'ansible_pipelining' from source: unknown 15896 1727203911.22807: variable 'ansible_timeout' from source: unknown 15896 1727203911.22816: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node1' 15896 1727203911.22958: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203911.23031: variable 'omit' from source: magic vars 15896 1727203911.23034: starting attempt loop 15896 1727203911.23037: running the handler 15896 1727203911.23039: _low_level_execute_command(): starting 15896 1727203911.23041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203911.23818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.23883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.23969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.25747: 
stdout chunk (state=3): >>>/root <<< 15896 1727203911.25847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.25889: stderr chunk (state=3): >>><<< 15896 1727203911.25892: stdout chunk (state=3): >>><<< 15896 1727203911.25906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.25920: _low_level_execute_command(): starting 15896 1727203911.25926: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581 `" && echo ansible-tmp-1727203911.2590714-20423-72381655138581="` echo /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581 `" ) && sleep 0' 15896 1727203911.26351: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.26355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.26381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.26389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.26409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.26446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.26450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.26462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.26545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.28668: stdout chunk (state=3): >>>ansible-tmp-1727203911.2590714-20423-72381655138581=/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581 <<< 15896 1727203911.28986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.28990: stdout chunk (state=3): >>><<< 15896 1727203911.28992: stderr chunk (state=3): >>><<< 15896 1727203911.28995: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203911.2590714-20423-72381655138581=/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.28997: variable 'ansible_module_compression' from source: unknown 15896 1727203911.29000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203911.29002: variable 'ansible_facts' from source: unknown 15896 1727203911.29082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py 15896 1727203911.29214: Sending initial data 15896 1727203911.29218: Sent initial data (155 bytes) 15896 1727203911.29637: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.29641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.29667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203911.29671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.29673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.29692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.29742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.29746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.29751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.29829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.31586: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203911.31668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203911.31766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmplvaf8who /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py <<< 15896 1727203911.31769: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py" <<< 15896 1727203911.31833: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmplvaf8who" to remote "/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py" <<< 15896 1727203911.32737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.32741: stdout chunk (state=3): >>><<< 15896 1727203911.32743: stderr chunk (state=3): >>><<< 15896 1727203911.32802: done transferring module to remote 15896 1727203911.32843: _low_level_execute_command(): starting 15896 1727203911.32847: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/ /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py && sleep 0' 15896 1727203911.33494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 
1727203911.33597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.33634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.33654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.33673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.33789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.35807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.35845: stderr chunk (state=3): >>><<< 15896 1727203911.35866: stdout chunk (state=3): >>><<< 15896 1727203911.35960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.35963: _low_level_execute_command(): starting 15896 1727203911.35966: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/AnsiballZ_command.py && sleep 0' 15896 1727203911.36523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203911.36536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.36550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.36568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.36588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203911.36644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.36702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.36719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.36768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.36870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.54332: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:51:51.533530", "end": "2024-09-24 14:51:51.541461", "delta": "0:00:00.007931", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203911.56090: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203911.56117: stdout chunk (state=3): >>><<< 15896 1727203911.56120: stderr chunk (state=3): >>><<< 15896 1727203911.56140: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:51:51.533530", "end": "2024-09-24 14:51:51.541461", "delta": "0:00:00.007931", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
15896 1727203911.56282: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203911.56287: _low_level_execute_command(): starting 15896 1727203911.56291: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203911.2590714-20423-72381655138581/ > /dev/null 2>&1 && sleep 0' 15896 1727203911.56821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203911.56893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.56943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.56961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.56987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.57103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.59174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.59180: stdout chunk (state=3): >>><<< 15896 1727203911.59183: stderr chunk (state=3): >>><<< 15896 1727203911.59382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.59386: handler run 
complete
15896 1727203911.59388: Evaluated conditional (False): False
15896 1727203911.59390: Evaluated conditional (False): False
15896 1727203911.59392: attempt loop complete, returning result
15896 1727203911.59394: _execute() done
15896 1727203911.59396: dumping result to json
15896 1727203911.59398: done dumping result, returning
15896 1727203911.59400: done running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' [028d2410-947f-fb83-b6ad-0000000001b1]
15896 1727203911.59402: sending task result for task 028d2410-947f-fb83-b6ad-0000000001b1
15896 1727203911.59473: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001b1
15896 1727203911.59478: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "nm-bond"
    ],
    "delta": "0:00:00.007931",
    "end": "2024-09-24 14:51:51.541461",
    "failed_when_result": false,
    "rc": 1,
    "start": "2024-09-24 14:51:51.533530"
}

STDERR:

Cannot find device "nm-bond"

MSG:

non-zero return code

15896 1727203911.59547: no more pending results, returning what we have
15896 1727203911.59551: results queue empty
15896 1727203911.59552: checking for any_errors_fatal
15896 1727203911.59554: done checking for any_errors_fatal
15896 1727203911.59555: checking for max_fail_percentage
15896 1727203911.59556: done checking for max_fail_percentage
15896 1727203911.59557: checking to see if all hosts have failed and the running result is not ok
15896 1727203911.59558: done checking to see if all hosts have failed
15896 1727203911.59559: getting the remaining hosts for this loop
15896 1727203911.59560: done getting the remaining hosts for this loop
15896 1727203911.59563: getting the next task for host managed-node1
15896 1727203911.59573: done getting next task for host managed-node1
15896 1727203911.59581: ^ task is: TASK: Remove test interfaces
15896 1727203911.59584: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0,
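[Editor's note] The result above reports rc=1 ("Cannot find device \"nm-bond\"") as ok, with "failed_when_result": false and "changed": false even though the module returned "changed": true; the two "Evaluated conditional (False): False" entries fit a task that overrides both failed_when and changed_when. A hypothetical reconstruction of such a task (the actual playbook source is not in this log, so the exact expressions are an assumption):

```yaml
# Reconstruction only: consistent with the log's observed behavior,
# not taken from the real playbook source.
- name: Delete the device 'nm-bond'
  command: ip link del nm-bond
  failed_when: false   # the device may already be gone; rc=1 must not fail cleanup
  changed_when: false  # module said changed=true, but the task result shows false
```

This is the common idempotent-cleanup idiom: a delete that finds nothing to delete is treated as success.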
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203911.59590: getting variables 15896 1727203911.59592: in VariableManager get_vars() 15896 1727203911.59646: Calling all_inventory to load vars for managed-node1 15896 1727203911.59649: Calling groups_inventory to load vars for managed-node1 15896 1727203911.59651: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203911.59663: Calling all_plugins_play to load vars for managed-node1 15896 1727203911.59666: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203911.59669: Calling groups_plugins_play to load vars for managed-node1 15896 1727203911.61306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203911.62920: done with get_vars() 15896 1727203911.62951: done getting variables 15896 1727203911.63015: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:51:51 -0400 (0:00:00.423) 0:00:57.219 ***** 15896 1727203911.63050: entering _queue_task() for managed-node1/shell 15896 1727203911.63439: worker is 1 (out of 1 available) 15896 1727203911.63451: exiting _queue_task() for managed-node1/shell 15896 1727203911.63465: done queuing things up, now waiting for results queue to drain 15896 1727203911.63467: waiting for pending results... 15896 1727203911.63892: running TaskExecutor() for managed-node1/TASK: Remove test interfaces 15896 1727203911.64014: in run() - task 028d2410-947f-fb83-b6ad-0000000001b5 15896 1727203911.64017: variable 'ansible_search_path' from source: unknown 15896 1727203911.64019: variable 'ansible_search_path' from source: unknown 15896 1727203911.64023: calling self._execute() 15896 1727203911.64085: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.64096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203911.64109: variable 'omit' from source: magic vars 15896 1727203911.64489: variable 'ansible_distribution_major_version' from source: facts 15896 1727203911.64507: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203911.64519: variable 'omit' from source: magic vars 15896 1727203911.64589: variable 'omit' from source: magic vars 15896 1727203911.64762: variable 'dhcp_interface1' from source: play vars 15896 1727203911.64783: variable 'dhcp_interface2' from source: play vars 15896 1727203911.64807: variable 'omit' from source: magic vars 15896 1727203911.64852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203911.64899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203911.64923: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203911.64945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203911.64963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203911.65084: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203911.65089: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.65091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203911.65139: Set connection var ansible_shell_type to sh 15896 1727203911.65153: Set connection var ansible_connection to ssh 15896 1727203911.65209: Set connection var ansible_shell_executable to /bin/sh 15896 1727203911.65213: Set connection var ansible_pipelining to False 15896 1727203911.65215: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203911.65218: Set connection var ansible_timeout to 10 15896 1727203911.65222: variable 'ansible_shell_executable' from source: unknown 15896 1727203911.65231: variable 'ansible_connection' from source: unknown 15896 1727203911.65238: variable 'ansible_module_compression' from source: unknown 15896 1727203911.65244: variable 'ansible_shell_type' from source: unknown 15896 1727203911.65251: variable 'ansible_shell_executable' from source: unknown 15896 1727203911.65258: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203911.65266: variable 'ansible_pipelining' from source: unknown 15896 1727203911.65318: variable 'ansible_timeout' from source: unknown 15896 1727203911.65321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203911.65436: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203911.65454: variable 'omit' from source: magic vars 15896 1727203911.65465: starting attempt loop 15896 1727203911.65471: running the handler 15896 1727203911.65488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203911.65512: _low_level_execute_command(): starting 15896 1727203911.65525: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203911.66299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.66380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.66414: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.66499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.68342: stdout chunk (state=3): >>>/root <<< 15896 1727203911.68449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.68681: stderr chunk (state=3): >>><<< 15896 1727203911.68685: stdout chunk (state=3): >>><<< 15896 1727203911.68689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.68692: _low_level_execute_command(): starting 15896 1727203911.68695: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354 `" && echo 
ansible-tmp-1727203911.685114-20445-154170167857354="` echo /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354 `" ) && sleep 0' 15896 1727203911.69134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203911.69144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.69166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.69181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.69189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203911.69197: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203911.69208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.69223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203911.69230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203911.69274: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203911.69279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.69282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.69284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.69287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203911.69289: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203911.69291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.69353: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.69383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.69386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.69491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.71644: stdout chunk (state=3): >>>ansible-tmp-1727203911.685114-20445-154170167857354=/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354 <<< 15896 1727203911.71774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.71842: stderr chunk (state=3): >>><<< 15896 1727203911.71982: stdout chunk (state=3): >>><<< 15896 1727203911.71985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203911.685114-20445-154170167857354=/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.71988: variable 'ansible_module_compression' from source: unknown 15896 1727203911.71990: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203911.72028: variable 'ansible_facts' from source: unknown 15896 1727203911.72135: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py 15896 1727203911.72296: Sending initial data 15896 1727203911.72338: Sent initial data (155 bytes) 15896 1727203911.72993: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.73035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.73047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.73081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 
1727203911.73184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.74952: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15896 1727203911.74987: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203911.75057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203911.75153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpc3alfumk /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py <<< 15896 1727203911.75156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py" <<< 15896 1727203911.75227: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpc3alfumk" to remote "/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py" <<< 15896 1727203911.76116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.76223: stderr chunk (state=3): >>><<< 15896 1727203911.76226: stdout chunk (state=3): >>><<< 15896 1727203911.76238: done transferring module to remote 15896 1727203911.76256: _low_level_execute_command(): starting 15896 1727203911.76276: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/ /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py && sleep 0' 15896 1727203911.77089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.77122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.77235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203911.79233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203911.79267: stdout chunk (state=3): >>><<< 15896 1727203911.79271: stderr chunk (state=3): >>><<< 15896 1727203911.79369: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203911.79372: _low_level_execute_command(): starting 15896 1727203911.79378: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/AnsiballZ_command.py && sleep 0' 15896 1727203911.79923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203911.79931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203911.79947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203911.79964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203911.79974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203911.79984: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203911.79994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.80009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203911.80053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203911.80129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203911.80134: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203911.80137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203911.80238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.00947: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:51:51.964447", "end": "2024-09-24 14:51:52.007754", "delta": "0:00:00.043307", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203912.02902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 15896 1727203912.02906: stdout chunk (state=3): >>><<< 15896 1727203912.02908: stderr chunk (state=3): >>><<< 15896 1727203912.02929: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:51:51.964447", "end": "2024-09-24 14:51:52.007754", "delta": "0:00:00.043307", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203912.03011: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203912.03014: _low_level_execute_command(): starting 15896 1727203912.03017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203911.685114-20445-154170167857354/ > /dev/null 2>&1 && sleep 0' 15896 1727203912.03794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.03823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.03936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.06578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.06639: stderr chunk (state=3): >>><<< 15896 1727203912.06735: stdout chunk (state=3): >>><<< 15896 1727203912.06739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15896 1727203912.06742: handler run complete
15896 1727203912.06744: Evaluated conditional (False): False
15896 1727203912.06881: attempt loop complete, returning result
15896 1727203912.06890: _execute() done
15896 1727203912.06898: dumping result to json
15896 1727203912.06908: done dumping result, returning
15896 1727203912.06921: done running TaskExecutor() for managed-node1/TASK: Remove test interfaces [028d2410-947f-fb83-b6ad-0000000001b5]
15896 1727203912.06977: sending task result for task 028d2410-947f-fb83-b6ad-0000000001b5
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n",
    "delta": "0:00:00.043307",
    "end": "2024-09-24 14:51:52.007754",
    "rc": 0,
    "start": "2024-09-24 14:51:51.964447"
}

STDERR:

+ exec
+ rc=0
+ ip link delete test1
+ '[' 0 '!=' 0 ']'
+ ip link delete test2
+ '[' 0 '!=' 0 ']'
+ ip link delete testbr
+ '[' 0 '!=' 0 ']'

15896 1727203912.07128: no more pending results, returning what we have
15896 1727203912.07132: results queue empty
15896 1727203912.07132: checking for any_errors_fatal
15896 1727203912.07143: done checking for any_errors_fatal
15896 1727203912.07144: checking for max_fail_percentage
15896 1727203912.07147: done checking for max_fail_percentage
15896 1727203912.07148: checking to see if all hosts have failed and the running result is not ok
15896 1727203912.07148: done checking to see if all hosts have failed
15896 1727203912.07149: getting the remaining hosts for this loop
15896 1727203912.07151: done getting the remaining hosts for this loop
15896 1727203912.07154: getting the next task for host managed-node1
15896 1727203912.07166: done getting next task for host managed-node1
15896 1727203912.07169: ^ task is: TASK: Stop dnsmasq/radvd services
15896 1727203912.07180: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
15896 1727203912.07186: getting variables
15896 1727203912.07192: in VariableManager get_vars()
15896 1727203912.07555: Calling all_inventory to load vars for managed-node1
15896 1727203912.07558: Calling groups_inventory to load vars for managed-node1
15896 1727203912.07564: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203912.07585: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001b5
15896 1727203912.07588: WORKER PROCESS EXITING
15896 1727203912.07598: Calling all_plugins_play to load vars for managed-node1
15896 1727203912.07699: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203912.07705: Calling groups_plugins_play to load vars for managed-node1
15896 1727203912.11181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203912.13315: done with get_vars()
15896 1727203912.13358: done getting variables
15896 1727203912.13406: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Stop dnsmasq/radvd services] *********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
Tuesday 24 September 2024  14:51:52 -0400 (0:00:00.503)       0:00:57.723 *****
15896 1727203912.13429: entering _queue_task() for managed-node1/shell
15896 1727203912.13690: worker is 1 (out of 1 available)
15896 1727203912.13705: exiting _queue_task() for managed-node1/shell
15896 1727203912.13717: done queuing things up, now waiting for results queue to drain
15896 1727203912.13719: waiting for
pending results... 15896 1727203912.13911: running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services 15896 1727203912.14019: in run() - task 028d2410-947f-fb83-b6ad-0000000001b6 15896 1727203912.14029: variable 'ansible_search_path' from source: unknown 15896 1727203912.14033: variable 'ansible_search_path' from source: unknown 15896 1727203912.14063: calling self._execute() 15896 1727203912.14147: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.14151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.14163: variable 'omit' from source: magic vars 15896 1727203912.14434: variable 'ansible_distribution_major_version' from source: facts 15896 1727203912.14444: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203912.14451: variable 'omit' from source: magic vars 15896 1727203912.14493: variable 'omit' from source: magic vars 15896 1727203912.14516: variable 'omit' from source: magic vars 15896 1727203912.14547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203912.14577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203912.14594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203912.14608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203912.14618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203912.14641: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203912.14644: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.14646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 
15896 1727203912.14718: Set connection var ansible_shell_type to sh 15896 1727203912.14724: Set connection var ansible_connection to ssh 15896 1727203912.14727: Set connection var ansible_shell_executable to /bin/sh 15896 1727203912.14733: Set connection var ansible_pipelining to False 15896 1727203912.14738: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203912.14743: Set connection var ansible_timeout to 10 15896 1727203912.14760: variable 'ansible_shell_executable' from source: unknown 15896 1727203912.14763: variable 'ansible_connection' from source: unknown 15896 1727203912.14767: variable 'ansible_module_compression' from source: unknown 15896 1727203912.14769: variable 'ansible_shell_type' from source: unknown 15896 1727203912.14772: variable 'ansible_shell_executable' from source: unknown 15896 1727203912.14774: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.14780: variable 'ansible_pipelining' from source: unknown 15896 1727203912.14783: variable 'ansible_timeout' from source: unknown 15896 1727203912.14787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.14890: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203912.14898: variable 'omit' from source: magic vars 15896 1727203912.14903: starting attempt loop 15896 1727203912.14906: running the handler 15896 1727203912.14915: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 
1727203912.14933: _low_level_execute_command(): starting 15896 1727203912.14941: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203912.16152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.16213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.16433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.16437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.16513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.18331: stdout chunk (state=3): >>>/root <<< 15896 1727203912.18494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.18581: stderr chunk (state=3): >>><<< 15896 1727203912.18618: stdout chunk (state=3): >>><<< 15896 1727203912.18627: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.18644: _low_level_execute_command(): starting 15896 1727203912.18836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768 `" && echo ansible-tmp-1727203912.1862845-20462-166547477886768="` echo /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768 `" ) && sleep 0' 15896 1727203912.19309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203912.19313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.19315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.19390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.19404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.19417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.19425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.19707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.21835: stdout chunk (state=3): >>>ansible-tmp-1727203912.1862845-20462-166547477886768=/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768 <<< 15896 1727203912.21988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.21993: stdout chunk (state=3): >>><<< 15896 1727203912.21996: stderr chunk (state=3): >>><<< 15896 1727203912.22041: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203912.1862845-20462-166547477886768=/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.22051: variable 'ansible_module_compression' from source: unknown 15896 1727203912.22110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203912.22149: variable 'ansible_facts' from source: unknown 15896 1727203912.22437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py 15896 1727203912.23188: Sending initial data 15896 1727203912.23191: Sent initial data (156 bytes) 15896 1727203912.23799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.23805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203912.23823: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203912.23830: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.23834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203912.23891: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.23935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.23941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.23971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.24087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.26482: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15896 1727203912.26495: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp6j5fcsez /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py <<< 15896 1727203912.26499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py" <<< 15896 1727203912.26594: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp6j5fcsez" to remote "/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py" <<< 15896 1727203912.26670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py" <<< 15896 1727203912.27856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.27860: stdout chunk (state=3): >>><<< 15896 1727203912.27871: stderr chunk (state=3): >>><<< 15896 1727203912.27936: done transferring module to remote 15896 1727203912.27947: _low_level_execute_command(): starting 15896 1727203912.27979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/ /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py && sleep 0' 15896 1727203912.29345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203912.29367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.29379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.29423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203912.29781: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203912.29785: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203912.29787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.29789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203912.29792: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203912.29794: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203912.29795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.29797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.29799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203912.29806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203912.29808: stderr chunk (state=3): >>>debug2: match found <<< 15896 1727203912.29810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.30149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.30283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.30398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.32434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.32438: stdout chunk (state=3): >>><<< 15896 1727203912.32440: stderr chunk (state=3): >>><<< 15896 1727203912.32443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.32445: _low_level_execute_command(): starting 15896 1727203912.32447: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/AnsiballZ_command.py && sleep 0' 15896 1727203912.33773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203912.34174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.34183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.34188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.34223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.53993: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:51:52.509008", "end": "2024-09-24 14:51:52.537998", "delta": "0:00:00.028990", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 
--sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203912.55924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203912.55928: stdout chunk (state=3): >>><<< 15896 1727203912.56141: stderr chunk (state=3): >>><<< 15896 1727203912.56145: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:51:52.509008", "end": "2024-09-24 14:51:52.537998", "delta": "0:00:00.028990", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n 
for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203912.56154: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203912.56157: _low_level_execute_command(): starting 15896 1727203912.56163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203912.1862845-20462-166547477886768/ > /dev/null 2>&1 && sleep 0' 15896 1727203912.57096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203912.57111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.57128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.57191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.57251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.57268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.57310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.57419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.59516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.59519: stdout chunk (state=3): >>><<< 15896 1727203912.59521: stderr chunk (state=3): >>><<< 15896 1727203912.59536: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.59682: handler run complete 15896 1727203912.59685: Evaluated conditional (False): False 15896 1727203912.59687: attempt loop complete, returning result 15896 1727203912.59689: _execute() done 15896 1727203912.59691: dumping result to json 15896 1727203912.59693: done dumping result, returning 15896 1727203912.59695: done running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services [028d2410-947f-fb83-b6ad-0000000001b6] 15896 1727203912.59697: sending task result for task 028d2410-947f-fb83-b6ad-0000000001b6 15896 1727203912.59769: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001b6 15896 1727203912.59772: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.028990", "end": "2024-09-24 14:51:52.537998", "rc": 0, "start": "2024-09-24 14:51:52.509008" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf 
/run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 15896 1727203912.59846: no more pending results, returning what we have 15896 1727203912.59850: results queue empty 15896 1727203912.59851: checking for any_errors_fatal 15896 1727203912.59865: done checking for any_errors_fatal 15896 1727203912.59866: checking for max_fail_percentage 15896 1727203912.59869: done checking for max_fail_percentage 15896 1727203912.59870: checking to see if all hosts have failed and the running result is not ok 15896 1727203912.59871: done checking to see if all hosts have failed 15896 1727203912.59871: getting the remaining hosts for this loop 15896 1727203912.59873: done getting the remaining hosts for this loop 15896 1727203912.59981: getting the next task for host managed-node1 15896 1727203912.59991: done getting next task for host managed-node1 15896 1727203912.59995: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 15896 1727203912.59998: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203912.60003: getting variables 15896 1727203912.60005: in VariableManager get_vars() 15896 1727203912.60104: Calling all_inventory to load vars for managed-node1 15896 1727203912.60106: Calling groups_inventory to load vars for managed-node1 15896 1727203912.60109: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203912.60119: Calling all_plugins_play to load vars for managed-node1 15896 1727203912.60122: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203912.60129: Calling groups_plugins_play to load vars for managed-node1 15896 1727203912.61382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203912.62702: done with get_vars() 15896 1727203912.62729: done getting variables 15896 1727203912.62791: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Tuesday 24 September 2024 14:51:52 -0400 (0:00:00.493) 0:00:58.217 ***** 15896 1727203912.62822: entering _queue_task() for managed-node1/command 15896 1727203912.63130: worker is 1 (out of 1 available) 15896 1727203912.63146: exiting _queue_task() for managed-node1/command 15896 1727203912.63161: done queuing things up, now waiting for results queue to drain 15896 1727203912.63163: waiting for pending results... 
15896 1727203912.63493: running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript 15896 1727203912.63542: in run() - task 028d2410-947f-fb83-b6ad-0000000001b7 15896 1727203912.63571: variable 'ansible_search_path' from source: unknown 15896 1727203912.63628: calling self._execute() 15896 1727203912.63752: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.63775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.63795: variable 'omit' from source: magic vars 15896 1727203912.64219: variable 'ansible_distribution_major_version' from source: facts 15896 1727203912.64246: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203912.64333: variable 'network_provider' from source: set_fact 15896 1727203912.64338: Evaluated conditional (network_provider == "initscripts"): False 15896 1727203912.64341: when evaluation is False, skipping this task 15896 1727203912.64344: _execute() done 15896 1727203912.64346: dumping result to json 15896 1727203912.64355: done dumping result, returning 15896 1727203912.64359: done running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript [028d2410-947f-fb83-b6ad-0000000001b7] 15896 1727203912.64364: sending task result for task 028d2410-947f-fb83-b6ad-0000000001b7 15896 1727203912.64453: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001b7 15896 1727203912.64456: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15896 1727203912.64508: no more pending results, returning what we have 15896 1727203912.64512: results queue empty 15896 1727203912.64513: checking for any_errors_fatal 15896 1727203912.64524: done checking for any_errors_fatal 15896 1727203912.64525: checking for max_fail_percentage 15896 1727203912.64527: done 
checking for max_fail_percentage 15896 1727203912.64528: checking to see if all hosts have failed and the running result is not ok 15896 1727203912.64528: done checking to see if all hosts have failed 15896 1727203912.64529: getting the remaining hosts for this loop 15896 1727203912.64531: done getting the remaining hosts for this loop 15896 1727203912.64534: getting the next task for host managed-node1 15896 1727203912.64540: done getting next task for host managed-node1 15896 1727203912.64543: ^ task is: TASK: Verify network state restored to default 15896 1727203912.64545: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203912.64550: getting variables 15896 1727203912.64551: in VariableManager get_vars() 15896 1727203912.64602: Calling all_inventory to load vars for managed-node1 15896 1727203912.64605: Calling groups_inventory to load vars for managed-node1 15896 1727203912.64607: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203912.64616: Calling all_plugins_play to load vars for managed-node1 15896 1727203912.64619: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203912.64621: Calling groups_plugins_play to load vars for managed-node1 15896 1727203912.65519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203912.66708: done with get_vars() 15896 1727203912.66729: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Tuesday 24 September 2024 14:51:52 -0400 (0:00:00.039) 0:00:58.257 ***** 15896 1727203912.66823: entering _queue_task() for managed-node1/include_tasks 15896 1727203912.67091: worker is 1 (out of 1 available) 15896 1727203912.67105: exiting _queue_task() for managed-node1/include_tasks 15896 1727203912.67117: done queuing things up, now waiting for results queue to drain 15896 1727203912.67119: waiting for pending results... 
15896 1727203912.67308: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 15896 1727203912.67385: in run() - task 028d2410-947f-fb83-b6ad-0000000001b8 15896 1727203912.67397: variable 'ansible_search_path' from source: unknown 15896 1727203912.67423: calling self._execute() 15896 1727203912.67507: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.67510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.67520: variable 'omit' from source: magic vars 15896 1727203912.67786: variable 'ansible_distribution_major_version' from source: facts 15896 1727203912.67796: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203912.67802: _execute() done 15896 1727203912.67805: dumping result to json 15896 1727203912.67808: done dumping result, returning 15896 1727203912.67817: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [028d2410-947f-fb83-b6ad-0000000001b8] 15896 1727203912.67820: sending task result for task 028d2410-947f-fb83-b6ad-0000000001b8 15896 1727203912.67914: done sending task result for task 028d2410-947f-fb83-b6ad-0000000001b8 15896 1727203912.67918: WORKER PROCESS EXITING 15896 1727203912.67943: no more pending results, returning what we have 15896 1727203912.67947: in VariableManager get_vars() 15896 1727203912.68006: Calling all_inventory to load vars for managed-node1 15896 1727203912.68009: Calling groups_inventory to load vars for managed-node1 15896 1727203912.68011: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203912.68021: Calling all_plugins_play to load vars for managed-node1 15896 1727203912.68024: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203912.68026: Calling groups_plugins_play to load vars for managed-node1 15896 1727203912.68793: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203912.70159: done with get_vars() 15896 1727203912.70174: variable 'ansible_search_path' from source: unknown 15896 1727203912.70187: we have included files to process 15896 1727203912.70188: generating all_blocks data 15896 1727203912.70189: done generating all_blocks data 15896 1727203912.70192: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15896 1727203912.70193: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15896 1727203912.70194: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15896 1727203912.70545: done processing included file 15896 1727203912.70548: iterating over new_blocks loaded from include file 15896 1727203912.70549: in VariableManager get_vars() 15896 1727203912.70580: done with get_vars() 15896 1727203912.70582: filtering new block on tags 15896 1727203912.70622: done filtering new block on tags 15896 1727203912.70625: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 15896 1727203912.70630: extending task lists for all hosts with included blocks 15896 1727203912.72002: done extending task lists 15896 1727203912.72003: done processing included files 15896 1727203912.72004: results queue empty 15896 1727203912.72005: checking for any_errors_fatal 15896 1727203912.72008: done checking for any_errors_fatal 15896 1727203912.72009: checking for max_fail_percentage 15896 1727203912.72010: done checking for max_fail_percentage 15896 1727203912.72011: checking to see if all hosts have failed and the running 
result is not ok 15896 1727203912.72011: done checking to see if all hosts have failed 15896 1727203912.72012: getting the remaining hosts for this loop 15896 1727203912.72013: done getting the remaining hosts for this loop 15896 1727203912.72020: getting the next task for host managed-node1 15896 1727203912.72026: done getting next task for host managed-node1 15896 1727203912.72029: ^ task is: TASK: Check routes and DNS 15896 1727203912.72032: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 15896 1727203912.72034: getting variables 15896 1727203912.72035: in VariableManager get_vars() 15896 1727203912.72056: Calling all_inventory to load vars for managed-node1 15896 1727203912.72059: Calling groups_inventory to load vars for managed-node1 15896 1727203912.72061: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203912.72070: Calling all_plugins_play to load vars for managed-node1 15896 1727203912.72072: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203912.72079: Calling groups_plugins_play to load vars for managed-node1 15896 1727203912.73149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203912.74159: done with get_vars() 15896 1727203912.74181: done getting variables 15896 1727203912.74237: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:51:52 -0400 (0:00:00.074) 0:00:58.331 ***** 15896 1727203912.74258: entering _queue_task() for managed-node1/shell 15896 1727203912.74581: worker is 1 (out of 1 available) 15896 1727203912.74594: exiting _queue_task() for managed-node1/shell 15896 1727203912.74605: done queuing things up, now waiting for results queue to drain 15896 1727203912.74607: waiting for pending results... 
15896 1727203912.74801: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 15896 1727203912.74883: in run() - task 028d2410-947f-fb83-b6ad-0000000009f0 15896 1727203912.74896: variable 'ansible_search_path' from source: unknown 15896 1727203912.74899: variable 'ansible_search_path' from source: unknown 15896 1727203912.74926: calling self._execute() 15896 1727203912.75017: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.75021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.75030: variable 'omit' from source: magic vars 15896 1727203912.75318: variable 'ansible_distribution_major_version' from source: facts 15896 1727203912.75329: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203912.75334: variable 'omit' from source: magic vars 15896 1727203912.75368: variable 'omit' from source: magic vars 15896 1727203912.75395: variable 'omit' from source: magic vars 15896 1727203912.75428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203912.75457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203912.75477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203912.75493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203912.75503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203912.75527: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203912.75530: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.75533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.75606: 
Set connection var ansible_shell_type to sh 15896 1727203912.75613: Set connection var ansible_connection to ssh 15896 1727203912.75618: Set connection var ansible_shell_executable to /bin/sh 15896 1727203912.75623: Set connection var ansible_pipelining to False 15896 1727203912.75628: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203912.75634: Set connection var ansible_timeout to 10 15896 1727203912.75650: variable 'ansible_shell_executable' from source: unknown 15896 1727203912.75653: variable 'ansible_connection' from source: unknown 15896 1727203912.75656: variable 'ansible_module_compression' from source: unknown 15896 1727203912.75658: variable 'ansible_shell_type' from source: unknown 15896 1727203912.75661: variable 'ansible_shell_executable' from source: unknown 15896 1727203912.75663: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203912.75668: variable 'ansible_pipelining' from source: unknown 15896 1727203912.75671: variable 'ansible_timeout' from source: unknown 15896 1727203912.75673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203912.75779: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203912.75783: variable 'omit' from source: magic vars 15896 1727203912.75788: starting attempt loop 15896 1727203912.75791: running the handler 15896 1727203912.75800: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203912.75816: 
_low_level_execute_command(): starting 15896 1727203912.75823: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203912.76354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.76358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.76361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.76363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.76406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.76415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.76430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.76518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.78284: stdout chunk (state=3): >>>/root <<< 15896 1727203912.78448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.78451: stdout chunk (state=3): >>><<< 15896 1727203912.78458: stderr chunk (state=3): >>><<< 15896 1727203912.78487: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.78492: _low_level_execute_command(): starting 15896 1727203912.78503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837 `" && echo ansible-tmp-1727203912.7846973-20490-191940249982837="` echo /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837 `" ) && sleep 0' 15896 1727203912.79124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.79127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203912.79137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.79140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.79144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.79179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.79184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.79293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.81379: stdout chunk (state=3): >>>ansible-tmp-1727203912.7846973-20490-191940249982837=/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837 <<< 15896 1727203912.81494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.81517: stderr chunk (state=3): >>><<< 15896 1727203912.81520: stdout chunk (state=3): >>><<< 15896 1727203912.81541: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203912.7846973-20490-191940249982837=/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.81582: variable 'ansible_module_compression' from source: unknown 15896 1727203912.81623: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203912.81652: variable 'ansible_facts' from source: unknown 15896 1727203912.81711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py 15896 1727203912.81807: Sending initial data 15896 1727203912.81810: Sent initial data (156 bytes) 15896 1727203912.82255: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.82258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 
1727203912.82263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.82265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.82267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.82314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.82318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.82401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.84154: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203912.84228: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 15896 1727203912.84333: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmp6x4qaa28 /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py <<< 15896 1727203912.84335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py" <<< 15896 1727203912.84402: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmp6x4qaa28" to remote "/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py" <<< 15896 1727203912.84407: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py" <<< 15896 1727203912.85240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.85291: stderr chunk (state=3): >>><<< 15896 1727203912.85294: stdout chunk (state=3): >>><<< 15896 1727203912.85331: done transferring module to remote 15896 1727203912.85340: _low_level_execute_command(): starting 15896 1727203912.85344: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/ /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py && sleep 0' 15896 1727203912.85916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203912.85920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203912.85923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 15896 1727203912.85928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203912.86001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.86081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203912.88033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203912.88057: stderr chunk (state=3): >>><<< 15896 1727203912.88063: stdout chunk (state=3): >>><<< 15896 1727203912.88083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203912.88086: _low_level_execute_command(): starting 15896 1727203912.88091: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/AnsiballZ_command.py && sleep 0' 15896 1727203912.88550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.88581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203912.88584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 15896 1727203912.88586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15896 1727203912.88588: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203912.88590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 
1727203912.88646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203912.88649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203912.88651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203912.88748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.06151: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3131sec preferred_lft 3131sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:51:53.050456", "end": "2024-09-24 14:51:53.059718", "delta": "0:00:00.009262", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203913.07953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203913.07957: stdout chunk (state=3): >>><<< 15896 1727203913.07963: stderr chunk (state=3): >>><<< 15896 1727203913.08122: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3131sec preferred_lft 3131sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF 
/etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:51:53.050456", "end": "2024-09-24 14:51:53.059718", "delta": "0:00:00.009262", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
15896 1727203913.08133: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203913.08136: _low_level_execute_command(): starting 15896 1727203913.08139: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203912.7846973-20490-191940249982837/ > /dev/null 2>&1 && sleep 0' 15896 1727203913.08695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203913.08707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203913.08719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203913.08734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203913.08748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203913.08794: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203913.08859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203913.08891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.08906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.09098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.11106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203913.11116: stdout chunk (state=3): >>><<< 15896 1727203913.11128: stderr chunk (state=3): >>><<< 15896 1727203913.11148: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203913.11162: handler run complete 15896 1727203913.11196: Evaluated conditional (False): False 15896 1727203913.11212: attempt loop complete, returning result 15896 1727203913.11219: _execute() done 15896 1727203913.11225: dumping result to json 15896 1727203913.11235: done dumping result, returning 15896 1727203913.11248: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [028d2410-947f-fb83-b6ad-0000000009f0] 15896 1727203913.11256: sending task result for task 028d2410-947f-fb83-b6ad-0000000009f0 15896 1727203913.11491: done sending task result for task 028d2410-947f-fb83-b6ad-0000000009f0 15896 1727203913.11496: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009262", "end": "2024-09-24 14:51:53.059718", "rc": 0, "start": "2024-09-24 14:51:53.050456" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.47/22 brd 10.31.15.255 
scope global dynamic noprefixroute eth0 valid_lft 3131sec preferred_lft 3131sec inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15896 1727203913.11572: no more pending results, returning what we have 15896 1727203913.11577: results queue empty 15896 1727203913.11578: checking for any_errors_fatal 15896 1727203913.11580: done checking for any_errors_fatal 15896 1727203913.11580: checking for max_fail_percentage 15896 1727203913.11582: done checking for max_fail_percentage 15896 1727203913.11583: checking to see if all hosts have failed and the running result is not ok 15896 1727203913.11584: done checking to see if all hosts have failed 15896 1727203913.11584: getting the remaining hosts for this loop 15896 1727203913.11586: done getting the remaining hosts for this loop 15896 1727203913.11589: getting the next task for host managed-node1 15896 1727203913.11595: done getting next task for host managed-node1 15896 1727203913.11598: ^ task is: TASK: Verify DNS and network connectivity 15896 1727203913.11600: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 15896 1727203913.11609: getting variables 15896 1727203913.11611: in VariableManager get_vars() 15896 1727203913.11819: Calling all_inventory to load vars for managed-node1 15896 1727203913.11823: Calling groups_inventory to load vars for managed-node1 15896 1727203913.11826: Calling all_plugins_inventory to load vars for managed-node1 15896 1727203913.11837: Calling all_plugins_play to load vars for managed-node1 15896 1727203913.11840: Calling groups_plugins_inventory to load vars for managed-node1 15896 1727203913.11843: Calling groups_plugins_play to load vars for managed-node1 15896 1727203913.13328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15896 1727203913.14885: done with get_vars() 15896 1727203913.14909: done getting variables 15896 1727203913.14970: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:51:53 -0400 (0:00:00.407) 0:00:58.739 ***** 15896 1727203913.15005: entering _queue_task() for managed-node1/shell 15896 1727203913.15346: worker is 1 (out of 1 available) 15896 
1727203913.15359: exiting _queue_task() for managed-node1/shell 15896 1727203913.15373: done queuing things up, now waiting for results queue to drain 15896 1727203913.15577: waiting for pending results... 15896 1727203913.15665: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 15896 1727203913.15791: in run() - task 028d2410-947f-fb83-b6ad-0000000009f1 15896 1727203913.15815: variable 'ansible_search_path' from source: unknown 15896 1727203913.15823: variable 'ansible_search_path' from source: unknown 15896 1727203913.15866: calling self._execute() 15896 1727203913.15983: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203913.16018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203913.16022: variable 'omit' from source: magic vars 15896 1727203913.16397: variable 'ansible_distribution_major_version' from source: facts 15896 1727203913.16413: Evaluated conditional (ansible_distribution_major_version != '6'): True 15896 1727203913.16669: variable 'ansible_facts' from source: unknown 15896 1727203913.17586: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15896 1727203913.17599: variable 'omit' from source: magic vars 15896 1727203913.17657: variable 'omit' from source: magic vars 15896 1727203913.17753: variable 'omit' from source: magic vars 15896 1727203913.18133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15896 1727203913.18137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15896 1727203913.18164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15896 1727203913.18189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203913.18205: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15896 1727203913.18243: variable 'inventory_hostname' from source: host vars for 'managed-node1' 15896 1727203913.18252: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203913.18260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203913.18372: Set connection var ansible_shell_type to sh 15896 1727203913.18468: Set connection var ansible_connection to ssh 15896 1727203913.18471: Set connection var ansible_shell_executable to /bin/sh 15896 1727203913.18474: Set connection var ansible_pipelining to False 15896 1727203913.18483: Set connection var ansible_module_compression to ZIP_DEFLATED 15896 1727203913.18489: Set connection var ansible_timeout to 10 15896 1727203913.18492: variable 'ansible_shell_executable' from source: unknown 15896 1727203913.18497: variable 'ansible_connection' from source: unknown 15896 1727203913.18503: variable 'ansible_module_compression' from source: unknown 15896 1727203913.18508: variable 'ansible_shell_type' from source: unknown 15896 1727203913.18510: variable 'ansible_shell_executable' from source: unknown 15896 1727203913.18714: variable 'ansible_host' from source: host vars for 'managed-node1' 15896 1727203913.18723: variable 'ansible_pipelining' from source: unknown 15896 1727203913.18726: variable 'ansible_timeout' from source: unknown 15896 1727203913.18733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 15896 1727203913.19443: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203913.19450: variable 'omit' from source: magic vars 15896 1727203913.19454: starting attempt 
loop 15896 1727203913.19456: running the handler 15896 1727203913.19459: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15896 1727203913.19466: _low_level_execute_command(): starting 15896 1727203913.19468: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15896 1727203913.20691: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.20795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.20905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.22670: stdout chunk (state=3): >>>/root <<< 15896 1727203913.22938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 
1727203913.22941: stdout chunk (state=3): >>><<< 15896 1727203913.22943: stderr chunk (state=3): >>><<< 15896 1727203913.22946: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203913.22948: _low_level_execute_command(): starting 15896 1727203913.22951: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551 `" && echo ansible-tmp-1727203913.2284708-20503-70193130023551="` echo /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551 `" ) && sleep 0' 15896 1727203913.23582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203913.23599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
15896 1727203913.23620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203913.23647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203913.23731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203913.23767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203913.23785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.23807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.23959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.26085: stdout chunk (state=3): >>>ansible-tmp-1727203913.2284708-20503-70193130023551=/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551 <<< 15896 1727203913.26195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203913.26227: stderr chunk (state=3): >>><<< 15896 1727203913.26229: stdout chunk (state=3): >>><<< 15896 1727203913.26284: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203913.2284708-20503-70193130023551=/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551 , 
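The `mkdir` one-liner executed just above is how Ansible provisions a private per-task temp directory on the remote host before transferring the module. A minimal standalone sketch of the same scheme; the function name and the use of `mktemp -d` are illustrative assumptions, not Ansible's actual implementation (Ansible builds the `ansible-tmp-<time>-<pid>-<random>` name itself from `time.time()`, the PID, and a random integer):

```shell
# Sketch of the remote tmpdir step seen in _low_level_execute_command().
# make_remote_tmp and the mktemp template are assumptions for illustration.
make_remote_tmp() {
    base="$1"
    # umask 77 keeps the parent and the tmpdir private (mode 700),
    # matching the "( umask 77 && mkdir -p ... )" one-liner in the trace
    ( umask 77 && mkdir -p "$base" && \
      mktemp -d "$base/ansible-tmp-$(date +%s)-$$-XXXXXX" )
}
```

As the rest of the trace shows, Ansible then transfers `AnsiballZ_command.py` into this directory over SFTP, `chmod u+x`'s it, executes it with the remote Python, and finally removes the directory with `rm -f -r ... && sleep 0`.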
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203913.26287: variable 'ansible_module_compression' from source: unknown 15896 1727203913.26317: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15896019yzub2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15896 1727203913.26350: variable 'ansible_facts' from source: unknown 15896 1727203913.26409: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py 15896 1727203913.26512: Sending initial data 15896 1727203913.26515: Sent initial data (155 bytes) 15896 1727203913.26990: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203913.27039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203913.27054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.27091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.27198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.28948: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15896 1727203913.29031: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15896 1727203913.29129: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15896019yzub2/tmpg0v0l2xb /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py <<< 15896 1727203913.29132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py" <<< 15896 1727203913.29205: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15896019yzub2/tmpg0v0l2xb" to remote "/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py" <<< 15896 1727203913.30281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203913.30284: stderr chunk (state=3): >>><<< 15896 1727203913.30286: stdout chunk (state=3): >>><<< 15896 1727203913.30288: done transferring module to remote 15896 1727203913.30290: _low_level_execute_command(): starting 15896 1727203913.30292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/ /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py && sleep 0' 15896 1727203913.31343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.31593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.31710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.33704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203913.33708: stdout chunk (state=3): >>><<< 15896 1727203913.33714: stderr chunk (state=3): >>><<< 15896 1727203913.33731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15896 1727203913.33734: _low_level_execute_command(): starting 15896 1727203913.33740: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/AnsiballZ_command.py && sleep 0' 15896 1727203913.34861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203913.35081: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203913.35094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15896 1727203913.35106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.35220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.90110: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND 
CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2320 0 --:--:-- --:--:-- --:--:-- 2328\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1316 0 --:--:-- --:--:-- --:--:-- 1322", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:51:53.515185", "end": "2024-09-24 14:51:53.899343", "delta": "0:00:00.384158", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15896 1727203913.91984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 15896 1727203913.91987: stdout chunk (state=3): >>><<< 15896 1727203913.91990: stderr chunk (state=3): >>><<< 15896 1727203913.92010: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2320 0 --:--:-- --:--:-- --:--:-- 2328\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1316 0 --:--:-- --:--:-- --:--:-- 1322", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:51:53.515185", "end": "2024-09-24 14:51:53.899343", "delta": "0:00:00.384158", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 15896 1727203913.92150: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15896 1727203913.92154: _low_level_execute_command(): starting 15896 1727203913.92156: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203913.2284708-20503-70193130023551/ > /dev/null 2>&1 && sleep 0' 15896 1727203913.92700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15896 1727203913.92715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203913.92730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203913.92749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15896 1727203913.92768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 15896 1727203913.92781: stderr chunk (state=3): >>>debug2: match not found <<< 15896 1727203913.92795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203913.92812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15896 1727203913.92822: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 15896 1727203913.92832: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15896 1727203913.92843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15896 1727203913.92857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15896 1727203913.92889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15896 1727203913.92952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 15896 1727203913.92978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15896 1727203913.93096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15896 1727203913.95132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15896 1727203913.95149: stdout chunk (state=3): >>><<< 15896 1727203913.95163: stderr chunk (state=3): >>><<< 15896 1727203913.95185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15896 1727203913.95196: handler run complete
15896 1727203913.95222: Evaluated conditional (False): False
15896 1727203913.95236: attempt loop complete, returning result
15896 1727203913.95242: _execute() done
15896 1727203913.95254: dumping result to json
15896 1727203913.95267: done dumping result, returning
15896 1727203913.95281: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [028d2410-947f-fb83-b6ad-0000000009f1]
15896 1727203913.95358: sending task result for task 028d2410-947f-fb83-b6ad-0000000009f1
15896 1727203913.95436: done sending task result for task 028d2410-947f-fb83-b6ad-0000000009f1
15896 1727203913.95439: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.384158",
    "end": "2024-09-24 14:51:53.899343",
    "rc": 0,
    "start": "2024-09-24 14:51:53.515185"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   2320      0 --:--:-- --:--:-- --:--:--  2328
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   1316      0 --:--:-- --:--:-- --:--:--  1322

15896 1727203913.95517: no more pending results, returning what we have
15896 1727203913.95521: results queue empty
15896 1727203913.95522: checking for any_errors_fatal
15896 1727203913.95535: done checking for any_errors_fatal
15896 1727203913.95540: checking for max_fail_percentage
15896 1727203913.95542: done checking for max_fail_percentage
15896 1727203913.95543: checking to see if all hosts have failed and the running result is not ok
15896 1727203913.95544: done checking to see if all hosts have failed
15896 1727203913.95544: getting the remaining hosts for this loop
15896 1727203913.95546: done getting the remaining hosts for this loop
15896 1727203913.95549: getting the next task for host managed-node1
15896 1727203913.95559: done getting next task for host managed-node1
15896 1727203913.95564: ^ task is: TASK: meta (flush_handlers)
15896 1727203913.95566: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203913.95571: getting variables
15896 1727203913.95572: in VariableManager get_vars()
15896 1727203913.95627: Calling all_inventory to load vars for managed-node1
15896 1727203913.95630: Calling groups_inventory to load vars for managed-node1
15896 1727203913.95633: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203913.95644: Calling all_plugins_play to load vars for managed-node1
15896 1727203913.95647: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203913.95650: Calling groups_plugins_play to load vars for managed-node1
15896 1727203913.97770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203913.99607: done with get_vars()
15896 1727203913.99634: done getting variables
15896 1727203913.99714: in VariableManager get_vars()
15896 1727203913.99737: Calling all_inventory to load vars for managed-node1
15896 1727203913.99740: Calling groups_inventory to load vars for managed-node1
15896 1727203913.99742: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203913.99747: Calling all_plugins_play to load vars for managed-node1
15896 1727203913.99749: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203913.99752: Calling groups_plugins_play to load vars for managed-node1
15896 1727203914.02803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203914.06011: done with get_vars()
15896 1727203914.06049: done queuing things up, now waiting for results queue to drain
15896 1727203914.06052: results queue empty
15896 1727203914.06053: checking for any_errors_fatal
15896 1727203914.06057: done checking for any_errors_fatal
15896 1727203914.06058: checking for max_fail_percentage
15896 1727203914.06059: done checking for max_fail_percentage
15896 1727203914.06062: checking to see if all hosts have failed and the running result is not ok
15896 1727203914.06063: done checking to see if all hosts have failed
15896 1727203914.06064: getting the remaining hosts for this loop
15896 1727203914.06065: done getting the remaining hosts for this loop
15896 1727203914.06068: getting the next task for host managed-node1
15896 1727203914.06072: done getting next task for host managed-node1
15896 1727203914.06073: ^ task is: TASK: meta (flush_handlers)
15896 1727203914.06076: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203914.06080: getting variables
15896 1727203914.06081: in VariableManager get_vars()
15896 1727203914.06104: Calling all_inventory to load vars for managed-node1
15896 1727203914.06106: Calling groups_inventory to load vars for managed-node1
15896 1727203914.06108: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203914.06114: Calling all_plugins_play to load vars for managed-node1
15896 1727203914.06116: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203914.06119: Calling groups_plugins_play to load vars for managed-node1
15896 1727203914.08537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203914.10404: done with get_vars()
15896 1727203914.10433: done getting variables
15896 1727203914.10496: in VariableManager get_vars()
15896 1727203914.10527: Calling all_inventory to load vars for managed-node1
15896 1727203914.10530: Calling groups_inventory to load vars for managed-node1
15896 1727203914.10532: Calling all_plugins_inventory to load vars for managed-node1
15896 1727203914.10537: Calling all_plugins_play to load vars for managed-node1
15896 1727203914.10539: Calling groups_plugins_inventory to load vars for managed-node1
15896 1727203914.10542: Calling groups_plugins_play to load vars for managed-node1
15896 1727203914.12530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15896 1727203914.15989: done with get_vars()
15896 1727203914.16030: done queuing things up, now waiting for results queue to drain
15896 1727203914.16034: results queue empty
15896 1727203914.16035: checking for any_errors_fatal
15896 1727203914.16036: done checking for any_errors_fatal
15896 1727203914.16037: checking for max_fail_percentage
15896 1727203914.16038: done checking for max_fail_percentage
15896 1727203914.16039: checking to see if all hosts have failed and the running result is not ok
15896 1727203914.16040: done checking to see if all hosts have failed
15896 1727203914.16040: getting the remaining hosts for this loop
15896 1727203914.16041: done getting the remaining hosts for this loop
15896 1727203914.16045: getting the next task for host managed-node1
15896 1727203914.16048: done getting next task for host managed-node1
15896 1727203914.16049: ^ task is: None
15896 1727203914.16051: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15896 1727203914.16052: done queuing things up, now waiting for results queue to drain
15896 1727203914.16053: results queue empty
15896 1727203914.16053: checking for any_errors_fatal
15896 1727203914.16054: done checking for any_errors_fatal
15896 1727203914.16055: checking for max_fail_percentage
15896 1727203914.16056: done checking for max_fail_percentage
15896 1727203914.16056: checking to see if all hosts have failed and the running result is not ok
15896 1727203914.16057: done checking to see if all hosts have failed
15896 1727203914.16062: getting the next task for host managed-node1
15896 1727203914.16065: done getting next task for host managed-node1
15896 1727203914.16066: ^ task is: None
15896 1727203914.16067: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1              : ok=109  changed=5    unreachable=0    failed=0    skipped=120  rescued=0    ignored=0

Tuesday 24 September 2024  14:51:54 -0400 (0:00:01.013)       0:00:59.752 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.18s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.16s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.07s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.06s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.99s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.92s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.37s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.33s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.23s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.20s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install dnsmasq --------------------------------------------------------- 1.09s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.04s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 1.01s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 1.00s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather the minimum subset of ansible_facts required by the network role test --- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.87s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
15896 1727203914.16427: RUNNING CLEANUP